Jan 26 11:50:42 np0005596062 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 26 11:50:42 np0005596062 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 26 11:50:42 np0005596062 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 11:50:42 np0005596062 kernel: BIOS-provided physical RAM map:
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 26 11:50:42 np0005596062 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 26 11:50:42 np0005596062 kernel: NX (Execute Disable) protection: active
Jan 26 11:50:42 np0005596062 kernel: APIC: Static calls initialized
Jan 26 11:50:42 np0005596062 kernel: SMBIOS 2.8 present.
Jan 26 11:50:42 np0005596062 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 26 11:50:42 np0005596062 kernel: Hypervisor detected: KVM
Jan 26 11:50:42 np0005596062 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 26 11:50:42 np0005596062 kernel: kvm-clock: using sched offset of 5012268291 cycles
Jan 26 11:50:42 np0005596062 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 26 11:50:42 np0005596062 kernel: tsc: Detected 2800.000 MHz processor
Jan 26 11:50:42 np0005596062 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 26 11:50:42 np0005596062 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 26 11:50:42 np0005596062 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 26 11:50:42 np0005596062 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 26 11:50:42 np0005596062 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 26 11:50:42 np0005596062 kernel: Using GB pages for direct mapping
Jan 26 11:50:42 np0005596062 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 26 11:50:42 np0005596062 kernel: ACPI: Early table checksum verification disabled
Jan 26 11:50:42 np0005596062 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 26 11:50:42 np0005596062 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 11:50:42 np0005596062 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 11:50:42 np0005596062 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 11:50:42 np0005596062 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 26 11:50:42 np0005596062 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 11:50:42 np0005596062 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 11:50:42 np0005596062 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 26 11:50:42 np0005596062 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 26 11:50:42 np0005596062 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 26 11:50:42 np0005596062 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 26 11:50:42 np0005596062 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 26 11:50:42 np0005596062 kernel: No NUMA configuration found
Jan 26 11:50:42 np0005596062 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 26 11:50:42 np0005596062 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 26 11:50:42 np0005596062 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 26 11:50:42 np0005596062 kernel: Zone ranges:
Jan 26 11:50:42 np0005596062 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 26 11:50:42 np0005596062 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 26 11:50:42 np0005596062 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 11:50:42 np0005596062 kernel:  Device   empty
Jan 26 11:50:42 np0005596062 kernel: Movable zone start for each node
Jan 26 11:50:42 np0005596062 kernel: Early memory node ranges
Jan 26 11:50:42 np0005596062 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 26 11:50:42 np0005596062 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 26 11:50:42 np0005596062 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 11:50:42 np0005596062 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 26 11:50:42 np0005596062 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 26 11:50:42 np0005596062 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 26 11:50:42 np0005596062 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 26 11:50:42 np0005596062 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 26 11:50:42 np0005596062 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 26 11:50:42 np0005596062 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 26 11:50:42 np0005596062 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 26 11:50:42 np0005596062 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 26 11:50:42 np0005596062 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 26 11:50:42 np0005596062 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 26 11:50:42 np0005596062 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 26 11:50:42 np0005596062 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 26 11:50:42 np0005596062 kernel: TSC deadline timer available
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Max. logical packages:   8
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Max. logical dies:       8
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Max. dies per package:   1
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Max. threads per core:   1
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Num. cores per package:     1
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Num. threads per package:   1
Jan 26 11:50:42 np0005596062 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 26 11:50:42 np0005596062 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 26 11:50:42 np0005596062 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 26 11:50:42 np0005596062 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 26 11:50:42 np0005596062 kernel: Booting paravirtualized kernel on KVM
Jan 26 11:50:42 np0005596062 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 26 11:50:42 np0005596062 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 26 11:50:42 np0005596062 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 26 11:50:42 np0005596062 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 26 11:50:42 np0005596062 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 11:50:42 np0005596062 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 26 11:50:42 np0005596062 kernel: random: crng init done
Jan 26 11:50:42 np0005596062 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: Fallback order for Node 0: 0 
Jan 26 11:50:42 np0005596062 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 26 11:50:42 np0005596062 kernel: Policy zone: Normal
Jan 26 11:50:42 np0005596062 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 26 11:50:42 np0005596062 kernel: software IO TLB: area num 8.
Jan 26 11:50:42 np0005596062 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 26 11:50:42 np0005596062 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 26 11:50:42 np0005596062 kernel: ftrace: allocated 194 pages with 3 groups
Jan 26 11:50:42 np0005596062 kernel: Dynamic Preempt: voluntary
Jan 26 11:50:42 np0005596062 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 26 11:50:42 np0005596062 kernel: rcu: 	RCU event tracing is enabled.
Jan 26 11:50:42 np0005596062 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 26 11:50:42 np0005596062 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 26 11:50:42 np0005596062 kernel: 	Rude variant of Tasks RCU enabled.
Jan 26 11:50:42 np0005596062 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 26 11:50:42 np0005596062 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 26 11:50:42 np0005596062 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 26 11:50:42 np0005596062 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 11:50:42 np0005596062 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 11:50:42 np0005596062 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 11:50:42 np0005596062 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 26 11:50:42 np0005596062 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 26 11:50:42 np0005596062 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 26 11:50:42 np0005596062 kernel: Console: colour VGA+ 80x25
Jan 26 11:50:42 np0005596062 kernel: printk: console [ttyS0] enabled
Jan 26 11:50:42 np0005596062 kernel: ACPI: Core revision 20230331
Jan 26 11:50:42 np0005596062 kernel: APIC: Switch to symmetric I/O mode setup
Jan 26 11:50:42 np0005596062 kernel: x2apic enabled
Jan 26 11:50:42 np0005596062 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 26 11:50:42 np0005596062 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 26 11:50:42 np0005596062 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 26 11:50:42 np0005596062 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 26 11:50:42 np0005596062 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 26 11:50:42 np0005596062 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 26 11:50:42 np0005596062 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 26 11:50:42 np0005596062 kernel: Spectre V2 : Mitigation: Retpolines
Jan 26 11:50:42 np0005596062 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 26 11:50:42 np0005596062 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 26 11:50:42 np0005596062 kernel: RETBleed: Mitigation: untrained return thunk
Jan 26 11:50:42 np0005596062 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 26 11:50:42 np0005596062 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 26 11:50:42 np0005596062 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 26 11:50:42 np0005596062 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 26 11:50:42 np0005596062 kernel: x86/bugs: return thunk changed
Jan 26 11:50:42 np0005596062 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 26 11:50:42 np0005596062 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 26 11:50:42 np0005596062 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 26 11:50:42 np0005596062 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 26 11:50:42 np0005596062 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 26 11:50:42 np0005596062 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 26 11:50:42 np0005596062 kernel: Freeing SMP alternatives memory: 40K
Jan 26 11:50:42 np0005596062 kernel: pid_max: default: 32768 minimum: 301
Jan 26 11:50:42 np0005596062 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 26 11:50:42 np0005596062 kernel: landlock: Up and running.
Jan 26 11:50:42 np0005596062 kernel: Yama: becoming mindful.
Jan 26 11:50:42 np0005596062 kernel: SELinux:  Initializing.
Jan 26 11:50:42 np0005596062 kernel: LSM support for eBPF active
Jan 26 11:50:42 np0005596062 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 26 11:50:42 np0005596062 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 26 11:50:42 np0005596062 kernel: ... version:                0
Jan 26 11:50:42 np0005596062 kernel: ... bit width:              48
Jan 26 11:50:42 np0005596062 kernel: ... generic registers:      6
Jan 26 11:50:42 np0005596062 kernel: ... value mask:             0000ffffffffffff
Jan 26 11:50:42 np0005596062 kernel: ... max period:             00007fffffffffff
Jan 26 11:50:42 np0005596062 kernel: ... fixed-purpose events:   0
Jan 26 11:50:42 np0005596062 kernel: ... event mask:             000000000000003f
Jan 26 11:50:42 np0005596062 kernel: signal: max sigframe size: 1776
Jan 26 11:50:42 np0005596062 kernel: rcu: Hierarchical SRCU implementation.
Jan 26 11:50:42 np0005596062 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 26 11:50:42 np0005596062 kernel: smp: Bringing up secondary CPUs ...
Jan 26 11:50:42 np0005596062 kernel: smpboot: x86: Booting SMP configuration:
Jan 26 11:50:42 np0005596062 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 26 11:50:42 np0005596062 kernel: smp: Brought up 1 node, 8 CPUs
Jan 26 11:50:42 np0005596062 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 26 11:50:42 np0005596062 kernel: node 0 deferred pages initialised in 9ms
Jan 26 11:50:42 np0005596062 kernel: Memory: 7763956K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618364K reserved, 0K cma-reserved)
Jan 26 11:50:42 np0005596062 kernel: devtmpfs: initialized
Jan 26 11:50:42 np0005596062 kernel: x86/mm: Memory block size: 128MB
Jan 26 11:50:42 np0005596062 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 26 11:50:42 np0005596062 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 26 11:50:42 np0005596062 kernel: pinctrl core: initialized pinctrl subsystem
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 26 11:50:42 np0005596062 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 26 11:50:42 np0005596062 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 26 11:50:42 np0005596062 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 26 11:50:42 np0005596062 kernel: audit: initializing netlink subsys (disabled)
Jan 26 11:50:42 np0005596062 kernel: audit: type=2000 audit(1769446239.824:1): state=initialized audit_enabled=0 res=1
Jan 26 11:50:42 np0005596062 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 26 11:50:42 np0005596062 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 26 11:50:42 np0005596062 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 26 11:50:42 np0005596062 kernel: cpuidle: using governor menu
Jan 26 11:50:42 np0005596062 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 26 11:50:42 np0005596062 kernel: PCI: Using configuration type 1 for base access
Jan 26 11:50:42 np0005596062 kernel: PCI: Using configuration type 1 for extended access
Jan 26 11:50:42 np0005596062 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 26 11:50:42 np0005596062 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 26 11:50:42 np0005596062 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 26 11:50:42 np0005596062 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 26 11:50:42 np0005596062 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 26 11:50:42 np0005596062 kernel: Demotion targets for Node 0: null
Jan 26 11:50:42 np0005596062 kernel: cryptd: max_cpu_qlen set to 1000
Jan 26 11:50:42 np0005596062 kernel: ACPI: Added _OSI(Module Device)
Jan 26 11:50:42 np0005596062 kernel: ACPI: Added _OSI(Processor Device)
Jan 26 11:50:42 np0005596062 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 26 11:50:42 np0005596062 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 26 11:50:42 np0005596062 kernel: ACPI: Interpreter enabled
Jan 26 11:50:42 np0005596062 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 26 11:50:42 np0005596062 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 26 11:50:42 np0005596062 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 26 11:50:42 np0005596062 kernel: PCI: Using E820 reservations for host bridge windows
Jan 26 11:50:42 np0005596062 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 26 11:50:42 np0005596062 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 26 11:50:42 np0005596062 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [3] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [4] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [5] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [6] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [7] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [8] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [9] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [10] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [11] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [12] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [13] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [14] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [15] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [16] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [17] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [18] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [19] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [20] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [21] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [22] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [23] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [24] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [25] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [26] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [27] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [28] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [29] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [30] registered
Jan 26 11:50:42 np0005596062 kernel: acpiphp: Slot [31] registered
Jan 26 11:50:42 np0005596062 kernel: PCI host bridge to bus 0000:00
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 26 11:50:42 np0005596062 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 26 11:50:42 np0005596062 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 26 11:50:42 np0005596062 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 26 11:50:42 np0005596062 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 26 11:50:42 np0005596062 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 26 11:50:42 np0005596062 kernel: iommu: Default domain type: Translated
Jan 26 11:50:42 np0005596062 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 26 11:50:42 np0005596062 kernel: SCSI subsystem initialized
Jan 26 11:50:42 np0005596062 kernel: ACPI: bus type USB registered
Jan 26 11:50:42 np0005596062 kernel: usbcore: registered new interface driver usbfs
Jan 26 11:50:42 np0005596062 kernel: usbcore: registered new interface driver hub
Jan 26 11:50:42 np0005596062 kernel: usbcore: registered new device driver usb
Jan 26 11:50:42 np0005596062 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 26 11:50:42 np0005596062 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 26 11:50:42 np0005596062 kernel: PTP clock support registered
Jan 26 11:50:42 np0005596062 kernel: EDAC MC: Ver: 3.0.0
Jan 26 11:50:42 np0005596062 kernel: NetLabel: Initializing
Jan 26 11:50:42 np0005596062 kernel: NetLabel:  domain hash size = 128
Jan 26 11:50:42 np0005596062 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 26 11:50:42 np0005596062 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 26 11:50:42 np0005596062 kernel: PCI: Using ACPI for IRQ routing
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 26 11:50:42 np0005596062 kernel: vgaarb: loaded
Jan 26 11:50:42 np0005596062 kernel: clocksource: Switched to clocksource kvm-clock
Jan 26 11:50:42 np0005596062 kernel: VFS: Disk quotas dquot_6.6.0
Jan 26 11:50:42 np0005596062 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 26 11:50:42 np0005596062 kernel: pnp: PnP ACPI init
Jan 26 11:50:42 np0005596062 kernel: pnp: PnP ACPI: found 5 devices
Jan 26 11:50:42 np0005596062 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_INET protocol family
Jan 26 11:50:42 np0005596062 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 26 11:50:42 np0005596062 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_XDP protocol family
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 26 11:50:42 np0005596062 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 26 11:50:42 np0005596062 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 26 11:50:42 np0005596062 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71965 usecs
Jan 26 11:50:42 np0005596062 kernel: PCI: CLS 0 bytes, default 64
Jan 26 11:50:42 np0005596062 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 26 11:50:42 np0005596062 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 26 11:50:42 np0005596062 kernel: ACPI: bus type thunderbolt registered
Jan 26 11:50:42 np0005596062 kernel: Trying to unpack rootfs image as initramfs...
Jan 26 11:50:42 np0005596062 kernel: Initialise system trusted keyrings
Jan 26 11:50:42 np0005596062 kernel: Key type blacklist registered
Jan 26 11:50:42 np0005596062 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 26 11:50:42 np0005596062 kernel: zbud: loaded
Jan 26 11:50:42 np0005596062 kernel: integrity: Platform Keyring initialized
Jan 26 11:50:42 np0005596062 kernel: integrity: Machine keyring initialized
Jan 26 11:50:42 np0005596062 kernel: Freeing initrd memory: 87956K
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_ALG protocol family
Jan 26 11:50:42 np0005596062 kernel: xor: automatically using best checksumming function   avx       
Jan 26 11:50:42 np0005596062 kernel: Key type asymmetric registered
Jan 26 11:50:42 np0005596062 kernel: Asymmetric key parser 'x509' registered
Jan 26 11:50:42 np0005596062 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 26 11:50:42 np0005596062 kernel: io scheduler mq-deadline registered
Jan 26 11:50:42 np0005596062 kernel: io scheduler kyber registered
Jan 26 11:50:42 np0005596062 kernel: io scheduler bfq registered
Jan 26 11:50:42 np0005596062 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 26 11:50:42 np0005596062 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 26 11:50:42 np0005596062 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 26 11:50:42 np0005596062 kernel: ACPI: button: Power Button [PWRF]
Jan 26 11:50:42 np0005596062 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 26 11:50:42 np0005596062 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 26 11:50:42 np0005596062 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 26 11:50:42 np0005596062 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 26 11:50:42 np0005596062 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 26 11:50:42 np0005596062 kernel: Non-volatile memory driver v1.3
Jan 26 11:50:42 np0005596062 kernel: rdac: device handler registered
Jan 26 11:50:42 np0005596062 kernel: hp_sw: device handler registered
Jan 26 11:50:42 np0005596062 kernel: emc: device handler registered
Jan 26 11:50:42 np0005596062 kernel: alua: device handler registered
Jan 26 11:50:42 np0005596062 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 26 11:50:42 np0005596062 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 26 11:50:42 np0005596062 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 26 11:50:42 np0005596062 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 26 11:50:42 np0005596062 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 26 11:50:42 np0005596062 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 26 11:50:42 np0005596062 kernel: usb usb1: Product: UHCI Host Controller
Jan 26 11:50:42 np0005596062 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 26 11:50:42 np0005596062 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 26 11:50:42 np0005596062 kernel: hub 1-0:1.0: USB hub found
Jan 26 11:50:42 np0005596062 kernel: hub 1-0:1.0: 2 ports detected
Jan 26 11:50:42 np0005596062 kernel: usbcore: registered new interface driver usbserial_generic
Jan 26 11:50:42 np0005596062 kernel: usbserial: USB Serial support registered for generic
Jan 26 11:50:42 np0005596062 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 26 11:50:42 np0005596062 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 26 11:50:42 np0005596062 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 26 11:50:42 np0005596062 kernel: mousedev: PS/2 mouse device common for all mice
Jan 26 11:50:42 np0005596062 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 26 11:50:42 np0005596062 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 26 11:50:42 np0005596062 kernel: rtc_cmos 00:04: registered as rtc0
Jan 26 11:50:42 np0005596062 kernel: rtc_cmos 00:04: setting system clock to 2026-01-26T16:50:41 UTC (1769446241)
Jan 26 11:50:42 np0005596062 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 26 11:50:42 np0005596062 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 26 11:50:42 np0005596062 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 26 11:50:42 np0005596062 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 26 11:50:42 np0005596062 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 26 11:50:42 np0005596062 kernel: usbcore: registered new interface driver usbhid
Jan 26 11:50:42 np0005596062 kernel: usbhid: USB HID core driver
Jan 26 11:50:42 np0005596062 kernel: drop_monitor: Initializing network drop monitor service
Jan 26 11:50:42 np0005596062 kernel: Initializing XFRM netlink socket
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_INET6 protocol family
Jan 26 11:50:42 np0005596062 kernel: Segment Routing with IPv6
Jan 26 11:50:42 np0005596062 kernel: NET: Registered PF_PACKET protocol family
Jan 26 11:50:42 np0005596062 kernel: mpls_gso: MPLS GSO support
Jan 26 11:50:42 np0005596062 kernel: IPI shorthand broadcast: enabled
Jan 26 11:50:42 np0005596062 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 26 11:50:42 np0005596062 kernel: AES CTR mode by8 optimization enabled
Jan 26 11:50:42 np0005596062 kernel: sched_clock: Marking stable (1818002010, 160222240)->(2074603120, -96378870)
Jan 26 11:50:42 np0005596062 kernel: registered taskstats version 1
Jan 26 11:50:42 np0005596062 kernel: Loading compiled-in X.509 certificates
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 26 11:50:42 np0005596062 kernel: Demotion targets for Node 0: null
Jan 26 11:50:42 np0005596062 kernel: page_owner is disabled
Jan 26 11:50:42 np0005596062 kernel: Key type .fscrypt registered
Jan 26 11:50:42 np0005596062 kernel: Key type fscrypt-provisioning registered
Jan 26 11:50:42 np0005596062 kernel: Key type big_key registered
Jan 26 11:50:42 np0005596062 kernel: Key type encrypted registered
Jan 26 11:50:42 np0005596062 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 26 11:50:42 np0005596062 kernel: Loading compiled-in module X.509 certificates
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 11:50:42 np0005596062 kernel: ima: Allocated hash algorithm: sha256
Jan 26 11:50:42 np0005596062 kernel: ima: No architecture policies found
Jan 26 11:50:42 np0005596062 kernel: evm: Initialising EVM extended attributes:
Jan 26 11:50:42 np0005596062 kernel: evm: security.selinux
Jan 26 11:50:42 np0005596062 kernel: evm: security.SMACK64 (disabled)
Jan 26 11:50:42 np0005596062 kernel: evm: security.SMACK64EXEC (disabled)
Jan 26 11:50:42 np0005596062 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 26 11:50:42 np0005596062 kernel: evm: security.SMACK64MMAP (disabled)
Jan 26 11:50:42 np0005596062 kernel: evm: security.apparmor (disabled)
Jan 26 11:50:42 np0005596062 kernel: evm: security.ima
Jan 26 11:50:42 np0005596062 kernel: evm: security.capability
Jan 26 11:50:42 np0005596062 kernel: evm: HMAC attrs: 0x1
Jan 26 11:50:42 np0005596062 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 26 11:50:42 np0005596062 kernel: Running certificate verification RSA selftest
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 26 11:50:42 np0005596062 kernel: Running certificate verification ECDSA selftest
Jan 26 11:50:42 np0005596062 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 26 11:50:42 np0005596062 kernel: clk: Disabling unused clocks
Jan 26 11:50:42 np0005596062 kernel: Freeing unused decrypted memory: 2028K
Jan 26 11:50:42 np0005596062 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 26 11:50:42 np0005596062 kernel: Write protecting the kernel read-only data: 30720k
Jan 26 11:50:42 np0005596062 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 26 11:50:42 np0005596062 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 26 11:50:42 np0005596062 kernel: Run /init as init process
Jan 26 11:50:42 np0005596062 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 11:50:42 np0005596062 systemd: Detected virtualization kvm.
Jan 26 11:50:42 np0005596062 systemd: Detected architecture x86-64.
Jan 26 11:50:42 np0005596062 systemd: Running in initrd.
Jan 26 11:50:42 np0005596062 systemd: No hostname configured, using default hostname.
Jan 26 11:50:42 np0005596062 systemd: Hostname set to <localhost>.
Jan 26 11:50:42 np0005596062 systemd: Initializing machine ID from VM UUID.
Jan 26 11:50:42 np0005596062 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 26 11:50:42 np0005596062 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 26 11:50:42 np0005596062 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 26 11:50:42 np0005596062 kernel: usb 1-1: Manufacturer: QEMU
Jan 26 11:50:42 np0005596062 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 26 11:50:42 np0005596062 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 26 11:50:42 np0005596062 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 26 11:50:42 np0005596062 systemd: Queued start job for default target Initrd Default Target.
Jan 26 11:50:42 np0005596062 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 11:50:42 np0005596062 systemd: Reached target Local Encrypted Volumes.
Jan 26 11:50:42 np0005596062 systemd: Reached target Initrd /usr File System.
Jan 26 11:50:42 np0005596062 systemd: Reached target Local File Systems.
Jan 26 11:50:42 np0005596062 systemd: Reached target Path Units.
Jan 26 11:50:42 np0005596062 systemd: Reached target Slice Units.
Jan 26 11:50:42 np0005596062 systemd: Reached target Swaps.
Jan 26 11:50:42 np0005596062 systemd: Reached target Timer Units.
Jan 26 11:50:42 np0005596062 systemd: Listening on D-Bus System Message Bus Socket.
Jan 26 11:50:42 np0005596062 systemd: Listening on Journal Socket (/dev/log).
Jan 26 11:50:42 np0005596062 systemd: Listening on Journal Socket.
Jan 26 11:50:42 np0005596062 systemd: Listening on udev Control Socket.
Jan 26 11:50:42 np0005596062 systemd: Listening on udev Kernel Socket.
Jan 26 11:50:42 np0005596062 systemd: Reached target Socket Units.
Jan 26 11:50:42 np0005596062 systemd: Starting Create List of Static Device Nodes...
Jan 26 11:50:42 np0005596062 systemd: Starting Journal Service...
Jan 26 11:50:42 np0005596062 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 11:50:42 np0005596062 systemd: Starting Apply Kernel Variables...
Jan 26 11:50:42 np0005596062 systemd: Starting Create System Users...
Jan 26 11:50:42 np0005596062 systemd: Starting Setup Virtual Console...
Jan 26 11:50:42 np0005596062 systemd: Finished Create List of Static Device Nodes.
Jan 26 11:50:42 np0005596062 systemd: Finished Apply Kernel Variables.
Jan 26 11:50:42 np0005596062 systemd: Finished Create System Users.
Jan 26 11:50:42 np0005596062 systemd-journald[305]: Journal started
Jan 26 11:50:42 np0005596062 systemd-journald[305]: Runtime Journal (/run/log/journal/5c33c4b014ac46af8c94d3bb1b6300af) is 8.0M, max 153.6M, 145.6M free.
Jan 26 11:50:42 np0005596062 systemd-sysusers[309]: Creating group 'users' with GID 100.
Jan 26 11:50:42 np0005596062 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Jan 26 11:50:42 np0005596062 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 26 11:50:42 np0005596062 systemd: Started Journal Service.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 11:50:42 np0005596062 systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 11:50:42 np0005596062 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 11:50:42 np0005596062 systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 11:50:42 np0005596062 systemd[1]: Finished Setup Virtual Console.
Jan 26 11:50:42 np0005596062 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting dracut cmdline hook...
Jan 26 11:50:42 np0005596062 dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Jan 26 11:50:42 np0005596062 dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 11:50:42 np0005596062 systemd[1]: Finished dracut cmdline hook.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting dracut pre-udev hook...
Jan 26 11:50:42 np0005596062 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 26 11:50:42 np0005596062 kernel: device-mapper: uevent: version 1.0.3
Jan 26 11:50:42 np0005596062 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 26 11:50:42 np0005596062 kernel: RPC: Registered named UNIX socket transport module.
Jan 26 11:50:42 np0005596062 kernel: RPC: Registered udp transport module.
Jan 26 11:50:42 np0005596062 kernel: RPC: Registered tcp transport module.
Jan 26 11:50:42 np0005596062 kernel: RPC: Registered tcp-with-tls transport module.
Jan 26 11:50:42 np0005596062 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 26 11:50:42 np0005596062 rpc.statd[439]: Version 2.5.4 starting
Jan 26 11:50:42 np0005596062 rpc.statd[439]: Initializing NSM state
Jan 26 11:50:42 np0005596062 rpc.idmapd[444]: Setting log level to 0
Jan 26 11:50:42 np0005596062 systemd[1]: Finished dracut pre-udev hook.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 11:50:42 np0005596062 systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 11:50:42 np0005596062 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting dracut pre-trigger hook...
Jan 26 11:50:42 np0005596062 systemd[1]: Finished dracut pre-trigger hook.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting Coldplug All udev Devices...
Jan 26 11:50:42 np0005596062 systemd[1]: Created slice Slice /system/modprobe.
Jan 26 11:50:42 np0005596062 systemd[1]: Starting Load Kernel Module configfs...
Jan 26 11:50:42 np0005596062 systemd[1]: Finished Coldplug All udev Devices.
Jan 26 11:50:42 np0005596062 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 11:50:42 np0005596062 systemd[1]: Finished Load Kernel Module configfs.
Jan 26 11:50:42 np0005596062 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 11:50:42 np0005596062 systemd[1]: Reached target Network.
Jan 26 11:50:42 np0005596062 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 11:50:42 np0005596062 systemd[1]: Starting dracut initqueue hook...
Jan 26 11:50:42 np0005596062 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 26 11:50:42 np0005596062 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 26 11:50:42 np0005596062 kernel: vda: vda1
Jan 26 11:50:42 np0005596062 systemd-udevd[458]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 11:50:43 np0005596062 kernel: scsi host0: ata_piix
Jan 26 11:50:43 np0005596062 kernel: scsi host1: ata_piix
Jan 26 11:50:43 np0005596062 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 26 11:50:43 np0005596062 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 26 11:50:43 np0005596062 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 11:50:43 np0005596062 systemd[1]: Reached target Initrd Root Device.
Jan 26 11:50:43 np0005596062 systemd[1]: Mounting Kernel Configuration File System...
Jan 26 11:50:43 np0005596062 systemd[1]: Mounted Kernel Configuration File System.
Jan 26 11:50:43 np0005596062 systemd[1]: Reached target System Initialization.
Jan 26 11:50:43 np0005596062 systemd[1]: Reached target Basic System.
Jan 26 11:50:43 np0005596062 kernel: ata1: found unknown device (class 0)
Jan 26 11:50:43 np0005596062 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 26 11:50:43 np0005596062 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 26 11:50:43 np0005596062 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 26 11:50:43 np0005596062 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 26 11:50:43 np0005596062 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 26 11:50:43 np0005596062 systemd[1]: Finished dracut initqueue hook.
Jan 26 11:50:43 np0005596062 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 11:50:43 np0005596062 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 26 11:50:43 np0005596062 systemd[1]: Reached target Remote File Systems.
Jan 26 11:50:43 np0005596062 systemd[1]: Starting dracut pre-mount hook...
Jan 26 11:50:43 np0005596062 systemd[1]: Finished dracut pre-mount hook.
Jan 26 11:50:43 np0005596062 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 26 11:50:43 np0005596062 systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 26 11:50:43 np0005596062 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 11:50:43 np0005596062 systemd[1]: Mounting /sysroot...
Jan 26 11:50:44 np0005596062 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 26 11:50:44 np0005596062 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 26 11:50:44 np0005596062 kernel: XFS (vda1): Ending clean mount
Jan 26 11:50:44 np0005596062 systemd[1]: Mounted /sysroot.
Jan 26 11:50:44 np0005596062 systemd[1]: Reached target Initrd Root File System.
Jan 26 11:50:44 np0005596062 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 26 11:50:44 np0005596062 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 26 11:50:44 np0005596062 systemd[1]: Reached target Initrd File Systems.
Jan 26 11:50:44 np0005596062 systemd[1]: Reached target Initrd Default Target.
Jan 26 11:50:44 np0005596062 systemd[1]: Starting dracut mount hook...
Jan 26 11:50:44 np0005596062 systemd[1]: Finished dracut mount hook.
Jan 26 11:50:44 np0005596062 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 26 11:50:44 np0005596062 rpc.idmapd[444]: exiting on signal 15
Jan 26 11:50:44 np0005596062 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 26 11:50:44 np0005596062 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Network.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Timer Units.
Jan 26 11:50:44 np0005596062 systemd[1]: dbus.socket: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Initrd Default Target.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Basic System.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Initrd Root Device.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Initrd /usr File System.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Path Units.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Remote File Systems.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Slice Units.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Socket Units.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target System Initialization.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Local File Systems.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Swaps.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut mount hook.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut pre-mount hook.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut initqueue hook.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Coldplug All udev Devices.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut pre-trigger hook.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Setup Virtual Console.
Jan 26 11:50:44 np0005596062 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Closed udev Control Socket.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Closed udev Kernel Socket.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut pre-udev hook.
Jan 26 11:50:44 np0005596062 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped dracut cmdline hook.
Jan 26 11:50:44 np0005596062 systemd[1]: Starting Cleanup udev Database...
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 26 11:50:44 np0005596062 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 26 11:50:44 np0005596062 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Stopped Create System Users.
Jan 26 11:50:44 np0005596062 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 26 11:50:44 np0005596062 systemd[1]: Finished Cleanup udev Database.
Jan 26 11:50:44 np0005596062 systemd[1]: Reached target Switch Root.
Jan 26 11:50:44 np0005596062 systemd[1]: Starting Switch Root...
Jan 26 11:50:44 np0005596062 systemd[1]: Switching root.
Jan 26 11:50:44 np0005596062 systemd-journald[305]: Journal stopped
Jan 26 11:50:45 np0005596062 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 26 11:50:45 np0005596062 kernel: audit: type=1404 audit(1769446244.780:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 11:50:45 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 11:50:45 np0005596062 kernel: audit: type=1403 audit(1769446244.962:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 26 11:50:45 np0005596062 systemd: Successfully loaded SELinux policy in 186.253ms.
Jan 26 11:50:45 np0005596062 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 35.695ms.
Jan 26 11:50:45 np0005596062 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 11:50:45 np0005596062 systemd: Detected virtualization kvm.
Jan 26 11:50:45 np0005596062 systemd: Detected architecture x86-64.
Jan 26 11:50:45 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 11:50:45 np0005596062 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Stopped Switch Root.
Jan 26 11:50:45 np0005596062 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 26 11:50:45 np0005596062 systemd: Created slice Slice /system/getty.
Jan 26 11:50:45 np0005596062 systemd: Created slice Slice /system/serial-getty.
Jan 26 11:50:45 np0005596062 systemd: Created slice Slice /system/sshd-keygen.
Jan 26 11:50:45 np0005596062 systemd: Created slice User and Session Slice.
Jan 26 11:50:45 np0005596062 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 11:50:45 np0005596062 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 26 11:50:45 np0005596062 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 26 11:50:45 np0005596062 systemd: Reached target Local Encrypted Volumes.
Jan 26 11:50:45 np0005596062 systemd: Stopped target Switch Root.
Jan 26 11:50:45 np0005596062 systemd: Stopped target Initrd File Systems.
Jan 26 11:50:45 np0005596062 systemd: Stopped target Initrd Root File System.
Jan 26 11:50:45 np0005596062 systemd: Reached target Local Integrity Protected Volumes.
Jan 26 11:50:45 np0005596062 systemd: Reached target Path Units.
Jan 26 11:50:45 np0005596062 systemd: Reached target rpc_pipefs.target.
Jan 26 11:50:45 np0005596062 systemd: Reached target Slice Units.
Jan 26 11:50:45 np0005596062 systemd: Reached target Swaps.
Jan 26 11:50:45 np0005596062 systemd: Reached target Local Verity Protected Volumes.
Jan 26 11:50:45 np0005596062 systemd: Listening on RPCbind Server Activation Socket.
Jan 26 11:50:45 np0005596062 systemd: Reached target RPC Port Mapper.
Jan 26 11:50:45 np0005596062 systemd: Listening on Process Core Dump Socket.
Jan 26 11:50:45 np0005596062 systemd: Listening on initctl Compatibility Named Pipe.
Jan 26 11:50:45 np0005596062 systemd: Listening on udev Control Socket.
Jan 26 11:50:45 np0005596062 systemd: Listening on udev Kernel Socket.
Jan 26 11:50:45 np0005596062 systemd: Mounting Huge Pages File System...
Jan 26 11:50:45 np0005596062 systemd: Mounting POSIX Message Queue File System...
Jan 26 11:50:45 np0005596062 systemd: Mounting Kernel Debug File System...
Jan 26 11:50:45 np0005596062 systemd: Mounting Kernel Trace File System...
Jan 26 11:50:45 np0005596062 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 11:50:45 np0005596062 systemd: Starting Create List of Static Device Nodes...
Jan 26 11:50:45 np0005596062 systemd: Starting Load Kernel Module configfs...
Jan 26 11:50:45 np0005596062 systemd: Starting Load Kernel Module drm...
Jan 26 11:50:45 np0005596062 systemd: Starting Load Kernel Module efi_pstore...
Jan 26 11:50:45 np0005596062 systemd: Starting Load Kernel Module fuse...
Jan 26 11:50:45 np0005596062 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 26 11:50:45 np0005596062 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Stopped File System Check on Root Device.
Jan 26 11:50:45 np0005596062 systemd: Stopped Journal Service.
Jan 26 11:50:45 np0005596062 systemd: Starting Journal Service...
Jan 26 11:50:45 np0005596062 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 11:50:45 np0005596062 systemd: Starting Generate network units from Kernel command line...
Jan 26 11:50:45 np0005596062 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 11:50:45 np0005596062 systemd: Starting Remount Root and Kernel File Systems...
Jan 26 11:50:45 np0005596062 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 26 11:50:45 np0005596062 systemd: Starting Apply Kernel Variables...
Jan 26 11:50:45 np0005596062 kernel: fuse: init (API version 7.37)
Jan 26 11:50:45 np0005596062 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 26 11:50:45 np0005596062 systemd: Starting Coldplug All udev Devices...
Jan 26 11:50:45 np0005596062 systemd: Mounted Huge Pages File System.
Jan 26 11:50:45 np0005596062 systemd: Mounted POSIX Message Queue File System.
Jan 26 11:50:45 np0005596062 systemd: Mounted Kernel Debug File System.
Jan 26 11:50:45 np0005596062 systemd: Mounted Kernel Trace File System.
Jan 26 11:50:45 np0005596062 systemd: Finished Create List of Static Device Nodes.
Jan 26 11:50:45 np0005596062 systemd: modprobe@configfs.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Finished Load Kernel Module configfs.
Jan 26 11:50:45 np0005596062 systemd: modprobe@efi_pstore.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Finished Load Kernel Module efi_pstore.
Jan 26 11:50:45 np0005596062 systemd: modprobe@fuse.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Finished Load Kernel Module fuse.
Jan 26 11:50:45 np0005596062 kernel: ACPI: bus type drm_connector registered
Jan 26 11:50:45 np0005596062 systemd: modprobe@drm.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Finished Load Kernel Module drm.
Jan 26 11:50:45 np0005596062 systemd-journald[678]: Journal started
Jan 26 11:50:45 np0005596062 systemd-journald[678]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 11:50:45 np0005596062 systemd[1]: Queued start job for default target Multi-User System.
Jan 26 11:50:45 np0005596062 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 26 11:50:45 np0005596062 systemd: Started Journal Service.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Generate network units from Kernel command line.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Apply Kernel Variables.
Jan 26 11:50:45 np0005596062 systemd[1]: Mounting FUSE Control File System...
Jan 26 11:50:45 np0005596062 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Rebuild Hardware Database...
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 26 11:50:45 np0005596062 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Load/Save OS Random Seed...
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Create System Users...
Jan 26 11:50:45 np0005596062 systemd-journald[678]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 11:50:45 np0005596062 systemd-journald[678]: Received client request to flush runtime journal.
Jan 26 11:50:45 np0005596062 systemd[1]: Mounted FUSE Control File System.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Load/Save OS Random Seed.
Jan 26 11:50:45 np0005596062 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Coldplug All udev Devices.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Create System Users.
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 11:50:45 np0005596062 systemd[1]: Reached target Preparation for Local File Systems.
Jan 26 11:50:45 np0005596062 systemd[1]: Reached target Local File Systems.
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 26 11:50:45 np0005596062 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 26 11:50:45 np0005596062 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 26 11:50:45 np0005596062 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Automatic Boot Loader Update...
Jan 26 11:50:45 np0005596062 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 26 11:50:45 np0005596062 systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 11:50:45 np0005596062 bootctl[697]: Couldn't find EFI system partition, skipping.
Jan 26 11:50:45 np0005596062 systemd[1]: Finished Automatic Boot Loader Update.
Jan 26 11:50:46 np0005596062 systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 11:50:46 np0005596062 systemd[1]: Starting Security Auditing Service...
Jan 26 11:50:46 np0005596062 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 26 11:50:46 np0005596062 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 26 11:50:46 np0005596062 systemd[1]: Starting RPC Bind...
Jan 26 11:50:46 np0005596062 systemd[1]: Starting Rebuild Journal Catalog...
Jan 26 11:50:46 np0005596062 systemd[1]: Finished Rebuild Journal Catalog.
Jan 26 11:50:46 np0005596062 systemd[1]: Started RPC Bind.
Jan 26 11:50:46 np0005596062 augenrules[708]: /sbin/augenrules: No change
Jan 26 11:50:46 np0005596062 augenrules[723]: No rules
Jan 26 11:50:46 np0005596062 augenrules[723]: enabled 1
Jan 26 11:50:46 np0005596062 augenrules[723]: failure 1
Jan 26 11:50:46 np0005596062 augenrules[723]: pid 701
Jan 26 11:50:46 np0005596062 augenrules[723]: rate_limit 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_limit 8192
Jan 26 11:50:46 np0005596062 augenrules[723]: lost 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog 2
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_wait_time 60000
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_wait_time_actual 0
Jan 26 11:50:46 np0005596062 augenrules[723]: enabled 1
Jan 26 11:50:46 np0005596062 augenrules[723]: failure 1
Jan 26 11:50:46 np0005596062 augenrules[723]: pid 701
Jan 26 11:50:46 np0005596062 augenrules[723]: rate_limit 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_limit 8192
Jan 26 11:50:46 np0005596062 augenrules[723]: lost 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_wait_time 60000
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_wait_time_actual 0
Jan 26 11:50:46 np0005596062 augenrules[723]: enabled 1
Jan 26 11:50:46 np0005596062 augenrules[723]: failure 1
Jan 26 11:50:46 np0005596062 augenrules[723]: pid 701
Jan 26 11:50:46 np0005596062 augenrules[723]: rate_limit 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_limit 8192
Jan 26 11:50:46 np0005596062 augenrules[723]: lost 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog 0
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_wait_time 60000
Jan 26 11:50:46 np0005596062 augenrules[723]: backlog_wait_time_actual 0
Jan 26 11:50:46 np0005596062 systemd[1]: Started Security Auditing Service.
Jan 26 11:50:46 np0005596062 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 26 11:50:46 np0005596062 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 26 11:50:46 np0005596062 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 26 11:50:47 np0005596062 systemd[1]: Finished Rebuild Hardware Database.
Jan 26 11:50:47 np0005596062 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 11:50:47 np0005596062 systemd[1]: Starting Update is Completed...
Jan 26 11:50:47 np0005596062 systemd[1]: Finished Update is Completed.
Jan 26 11:50:47 np0005596062 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 11:50:47 np0005596062 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 11:50:47 np0005596062 systemd[1]: Reached target System Initialization.
Jan 26 11:50:47 np0005596062 systemd[1]: Started dnf makecache --timer.
Jan 26 11:50:47 np0005596062 systemd[1]: Started Daily rotation of log files.
Jan 26 11:50:47 np0005596062 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 26 11:50:47 np0005596062 systemd[1]: Reached target Timer Units.
Jan 26 11:50:47 np0005596062 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 11:50:47 np0005596062 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 26 11:50:47 np0005596062 systemd[1]: Reached target Socket Units.
Jan 26 11:50:47 np0005596062 systemd[1]: Starting D-Bus System Message Bus...
Jan 26 11:50:47 np0005596062 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 11:50:47 np0005596062 systemd[1]: Starting Load Kernel Module configfs...
Jan 26 11:50:47 np0005596062 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 11:50:47 np0005596062 systemd[1]: Finished Load Kernel Module configfs.
Jan 26 11:50:47 np0005596062 systemd[1]: Started D-Bus System Message Bus.
Jan 26 11:50:47 np0005596062 systemd[1]: Reached target Basic System.
Jan 26 11:50:47 np0005596062 dbus-broker-lau[743]: Ready
Jan 26 11:50:47 np0005596062 systemd[1]: Starting NTP client/server...
Jan 26 11:50:47 np0005596062 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 26 11:50:47 np0005596062 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 26 11:50:47 np0005596062 systemd[1]: Starting IPv4 firewall with iptables...
Jan 26 11:50:47 np0005596062 systemd[1]: Started irqbalance daemon.
Jan 26 11:50:47 np0005596062 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 26 11:50:47 np0005596062 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 11:50:47 np0005596062 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 11:50:47 np0005596062 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 11:50:47 np0005596062 systemd[1]: Reached target sshd-keygen.target.
Jan 26 11:50:47 np0005596062 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 26 11:50:47 np0005596062 systemd[1]: Reached target User and Group Name Lookups.
Jan 26 11:50:47 np0005596062 systemd-udevd[757]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 11:50:48 np0005596062 chronyd[783]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 11:50:48 np0005596062 chronyd[783]: Loaded 0 symmetric keys
Jan 26 11:50:48 np0005596062 chronyd[783]: Using right/UTC timezone to obtain leap second data
Jan 26 11:50:48 np0005596062 chronyd[783]: Loaded seccomp filter (level 2)
Jan 26 11:50:48 np0005596062 systemd[1]: Starting User Login Management...
Jan 26 11:50:48 np0005596062 systemd[1]: Started NTP client/server.
Jan 26 11:50:48 np0005596062 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 26 11:50:48 np0005596062 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 26 11:50:49 np0005596062 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 26 11:50:49 np0005596062 cloud-init[800]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 26 Jan 2026 16:50:49 +0000. Up 9.48 seconds.
Jan 26 11:50:49 np0005596062 systemd-logind[781]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 11:50:49 np0005596062 systemd-logind[781]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 11:50:49 np0005596062 systemd-logind[781]: New seat seat0.
Jan 26 11:50:49 np0005596062 systemd[1]: Started User Login Management.
Jan 26 11:50:49 np0005596062 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 26 11:50:49 np0005596062 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 26 11:50:49 np0005596062 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 26 11:50:49 np0005596062 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 26 11:50:49 np0005596062 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 26 11:50:49 np0005596062 kernel: kvm_amd: TSC scaling supported
Jan 26 11:50:49 np0005596062 kernel: kvm_amd: Nested Virtualization enabled
Jan 26 11:50:49 np0005596062 kernel: kvm_amd: Nested Paging enabled
Jan 26 11:50:49 np0005596062 kernel: kvm_amd: LBR virtualization supported
Jan 26 11:50:49 np0005596062 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 26 11:50:49 np0005596062 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 26 11:50:49 np0005596062 kernel: Console: switching to colour dummy device 80x25
Jan 26 11:50:49 np0005596062 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 26 11:50:49 np0005596062 kernel: [drm] features: -context_init
Jan 26 11:50:49 np0005596062 kernel: [drm] number of scanouts: 1
Jan 26 11:50:49 np0005596062 kernel: [drm] number of cap sets: 0
Jan 26 11:50:49 np0005596062 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 26 11:50:49 np0005596062 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 26 11:50:49 np0005596062 kernel: Console: switching to colour frame buffer device 128x48
Jan 26 11:50:49 np0005596062 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 26 11:50:49 np0005596062 systemd[1]: run-cloud\x2dinit-tmp-tmp5wq_piaa.mount: Deactivated successfully.
Jan 26 11:50:49 np0005596062 iptables.init[774]: iptables: Applying firewall rules: [  OK  ]
Jan 26 11:50:49 np0005596062 systemd[1]: Finished IPv4 firewall with iptables.
Jan 26 11:50:49 np0005596062 systemd[1]: Starting Hostname Service...
Jan 26 11:50:49 np0005596062 systemd[1]: Started Hostname Service.
Jan 26 11:50:49 np0005596062 systemd-hostnamed[853]: Hostname set to <np0005596062.novalocal> (static)
Jan 26 11:50:49 np0005596062 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 26 11:50:49 np0005596062 systemd[1]: Reached target Preparation for Network.
Jan 26 11:50:49 np0005596062 systemd[1]: Starting Network Manager...
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9539] NetworkManager (version 1.54.3-2.el9) is starting... (boot:0500db80-16b3-49e6-bb63-ade1deb047ad)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9543] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9618] manager[0x557c01c5c000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9653] hostname: hostname: using hostnamed
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9654] hostname: static hostname changed from (none) to "np0005596062.novalocal"
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9658] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9766] manager[0x557c01c5c000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9767] manager[0x557c01c5c000]: rfkill: WWAN hardware radio set enabled
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9802] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9802] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9803] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9803] manager: Networking is enabled by state file
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9805] settings: Loaded settings plugin: keyfile (internal)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9814] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9829] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 11:50:49 np0005596062 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9840] dhcp: init: Using DHCP client 'internal'
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9842] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9851] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9858] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9865] device (lo): Activation: starting connection 'lo' (f73eba9c-44a7-4e51-ab55-16d275cdfcc3)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9872] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9875] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9900] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9903] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9905] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9907] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9908] device (eth0): carrier: link connected
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9911] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9917] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9921] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9924] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9925] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9927] manager: NetworkManager state is now CONNECTING
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9928] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9933] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9935] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9981] dhcp4 (eth0): state changed new lease, address=38.102.83.190
Jan 26 11:50:49 np0005596062 NetworkManager[857]: <info>  [1769446249.9987] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 11:50:50 np0005596062 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0004] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 11:50:50 np0005596062 systemd[1]: Started Network Manager.
Jan 26 11:50:50 np0005596062 systemd[1]: Reached target Network.
Jan 26 11:50:50 np0005596062 systemd[1]: Starting Network Manager Wait Online...
Jan 26 11:50:50 np0005596062 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 26 11:50:50 np0005596062 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0155] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0158] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0160] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0168] device (lo): Activation: successful, device activated.
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0176] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0180] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0183] device (eth0): Activation: successful, device activated.
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0187] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 11:50:50 np0005596062 NetworkManager[857]: <info>  [1769446250.0190] manager: startup complete
Jan 26 11:50:50 np0005596062 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 26 11:50:50 np0005596062 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 11:50:50 np0005596062 systemd[1]: Reached target NFS client services.
Jan 26 11:50:50 np0005596062 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 11:50:50 np0005596062 systemd[1]: Reached target Remote File Systems.
Jan 26 11:50:50 np0005596062 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 11:50:50 np0005596062 systemd[1]: Finished Network Manager Wait Online.
Jan 26 11:50:50 np0005596062 systemd[1]: Starting Cloud-init: Network Stage...
Jan 26 11:50:50 np0005596062 cloud-init[921]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 26 Jan 2026 16:50:50 +0000. Up 10.69 seconds.
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |  eth0  | True |        38.102.83.190         | 255.255.255.0 | global | fa:16:3e:ee:57:2b |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:feee:572b/64 |       .       |  link  | fa:16:3e:ee:57:2b |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 26 11:50:50 np0005596062 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 11:50:51 np0005596062 cloud-init[921]: Generating public/private rsa key pair.
Jan 26 11:50:51 np0005596062 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 26 11:50:51 np0005596062 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 26 11:50:51 np0005596062 cloud-init[921]: The key fingerprint is:
Jan 26 11:50:51 np0005596062 cloud-init[921]: SHA256:atGljd6lTDLy97hsg/ugfeUHFTTvMF0BZU9Xahmpaoo root@np0005596062.novalocal
Jan 26 11:50:51 np0005596062 cloud-init[921]: The key's randomart image is:
Jan 26 11:50:51 np0005596062 cloud-init[921]: +---[RSA 3072]----+
Jan 26 11:50:51 np0005596062 cloud-init[921]: |             o*=B|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |              +O+|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |          .  .*.+|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |       . =  ...+ |
Jan 26 11:50:51 np0005596062 cloud-init[921]: |      o S o...  .|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |       * *ooo    |
Jan 26 11:50:51 np0005596062 cloud-init[921]: |      o.++=o .   |
Jan 26 11:50:51 np0005596062 cloud-init[921]: |     .Eoo++o. .  |
Jan 26 11:50:51 np0005596062 cloud-init[921]: |      . o==o..   |
Jan 26 11:50:51 np0005596062 cloud-init[921]: +----[SHA256]-----+
Jan 26 11:50:51 np0005596062 cloud-init[921]: Generating public/private ecdsa key pair.
Jan 26 11:50:51 np0005596062 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 26 11:50:51 np0005596062 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 26 11:50:51 np0005596062 cloud-init[921]: The key fingerprint is:
Jan 26 11:50:51 np0005596062 cloud-init[921]: SHA256:eAuJSo6KredfcRbWDvNuc1O/5c7RER3ZeL9EMZWHXEI root@np0005596062.novalocal
Jan 26 11:50:51 np0005596062 cloud-init[921]: The key's randomart image is:
Jan 26 11:50:51 np0005596062 cloud-init[921]: +---[ECDSA 256]---+
Jan 26 11:50:51 np0005596062 cloud-init[921]: |             oEBO|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |         .    +=B|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |        = .   .o+|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |     . + *     .o|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |  . . = S o   o..|
Jan 26 11:50:51 np0005596062 cloud-init[921]: | + .   * o   . oo|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |. o   . . + o  .+|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |o..  .   . o . o+|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |++o..          o+|
Jan 26 11:50:51 np0005596062 cloud-init[921]: +----[SHA256]-----+
Jan 26 11:50:51 np0005596062 cloud-init[921]: Generating public/private ed25519 key pair.
Jan 26 11:50:51 np0005596062 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 26 11:50:51 np0005596062 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 26 11:50:51 np0005596062 cloud-init[921]: The key fingerprint is:
Jan 26 11:50:51 np0005596062 cloud-init[921]: SHA256:p3T2vNN1hD1ZVnquweRUmNdvfCJGzTDh7np9CQiEKL0 root@np0005596062.novalocal
Jan 26 11:50:51 np0005596062 cloud-init[921]: The key's randomart image is:
Jan 26 11:50:51 np0005596062 cloud-init[921]: +--[ED25519 256]--+
Jan 26 11:50:51 np0005596062 cloud-init[921]: |    . . .   += o=|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |   . o . . ...=o=|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |    . . .  .. +*=|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |     E   . .o=o=B|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |        S =.o.+++|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |       . = = . oo|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |        .   +oo.o|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |           .o.o..|
Jan 26 11:50:51 np0005596062 cloud-init[921]: |          .... . |
Jan 26 11:50:51 np0005596062 cloud-init[921]: +----[SHA256]-----+
Jan 26 11:50:51 np0005596062 systemd[1]: Finished Cloud-init: Network Stage.
Jan 26 11:50:51 np0005596062 systemd[1]: Reached target Cloud-config availability.
Jan 26 11:50:51 np0005596062 systemd[1]: Reached target Network is Online.
Jan 26 11:50:51 np0005596062 systemd[1]: Starting Cloud-init: Config Stage...
Jan 26 11:50:51 np0005596062 systemd[1]: Starting Crash recovery kernel arming...
Jan 26 11:50:51 np0005596062 systemd[1]: Starting Notify NFS peers of a restart...
Jan 26 11:50:51 np0005596062 systemd[1]: Starting System Logging Service...
Jan 26 11:50:51 np0005596062 systemd[1]: Starting OpenSSH server daemon...
Jan 26 11:50:51 np0005596062 sm-notify[1004]: Version 2.5.4 starting
Jan 26 11:50:51 np0005596062 systemd[1]: Starting Permit User Sessions...
Jan 26 11:50:51 np0005596062 systemd[1]: Started Notify NFS peers of a restart.
Jan 26 11:50:51 np0005596062 systemd[1]: Finished Permit User Sessions.
Jan 26 11:50:51 np0005596062 systemd[1]: Started OpenSSH server daemon.
Jan 26 11:50:51 np0005596062 systemd[1]: Started Command Scheduler.
Jan 26 11:50:51 np0005596062 systemd[1]: Started Getty on tty1.
Jan 26 11:50:51 np0005596062 systemd[1]: Started Serial Getty on ttyS0.
Jan 26 11:50:51 np0005596062 systemd[1]: Reached target Login Prompts.
Jan 26 11:50:51 np0005596062 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Jan 26 11:50:51 np0005596062 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 26 11:50:51 np0005596062 systemd[1]: Started System Logging Service.
Jan 26 11:50:52 np0005596062 systemd[1]: Reached target Multi-User System.
Jan 26 11:50:52 np0005596062 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 26 11:50:52 np0005596062 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 26 11:50:52 np0005596062 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 26 11:50:52 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 11:50:52 np0005596062 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Jan 26 11:50:52 np0005596062 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 26 11:50:52 np0005596062 cloud-init[1193]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 26 Jan 2026 16:50:52 +0000. Up 12.61 seconds.
Jan 26 11:50:52 np0005596062 systemd[1]: Finished Cloud-init: Config Stage.
Jan 26 11:50:52 np0005596062 systemd[1]: Starting Cloud-init: Final Stage...
Jan 26 11:50:52 np0005596062 dracut[1265]: dracut-057-102.git20250818.el9
Jan 26 11:50:52 np0005596062 dracut[1267]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 26 11:50:52 np0005596062 cloud-init[1344]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 26 Jan 2026 16:50:52 +0000. Up 13.10 seconds.
Jan 26 11:50:52 np0005596062 cloud-init[1361]: #############################################################
Jan 26 11:50:52 np0005596062 cloud-init[1362]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 26 11:50:52 np0005596062 cloud-init[1364]: 256 SHA256:eAuJSo6KredfcRbWDvNuc1O/5c7RER3ZeL9EMZWHXEI root@np0005596062.novalocal (ECDSA)
Jan 26 11:50:52 np0005596062 cloud-init[1369]: 256 SHA256:p3T2vNN1hD1ZVnquweRUmNdvfCJGzTDh7np9CQiEKL0 root@np0005596062.novalocal (ED25519)
Jan 26 11:50:52 np0005596062 cloud-init[1374]: 3072 SHA256:atGljd6lTDLy97hsg/ugfeUHFTTvMF0BZU9Xahmpaoo root@np0005596062.novalocal (RSA)
Jan 26 11:50:52 np0005596062 cloud-init[1375]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 26 11:50:52 np0005596062 cloud-init[1376]: #############################################################
Jan 26 11:50:53 np0005596062 cloud-init[1344]: Cloud-init v. 24.4-8.el9 finished at Mon, 26 Jan 2026 16:50:52 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.26 seconds
Jan 26 11:50:53 np0005596062 systemd[1]: Finished Cloud-init: Final Stage.
Jan 26 11:50:53 np0005596062 systemd[1]: Reached target Cloud-init target.
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: memstrack is not available
Jan 26 11:50:53 np0005596062 dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 11:50:53 np0005596062 dracut[1267]: memstrack is not available
Jan 26 11:50:53 np0005596062 dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 11:50:54 np0005596062 dracut[1267]: *** Including module: systemd ***
Jan 26 11:50:54 np0005596062 dracut[1267]: *** Including module: fips ***
Jan 26 11:50:54 np0005596062 dracut[1267]: *** Including module: systemd-initrd ***
Jan 26 11:50:54 np0005596062 dracut[1267]: *** Including module: i18n ***
Jan 26 11:50:54 np0005596062 dracut[1267]: *** Including module: drm ***
Jan 26 11:50:54 np0005596062 chronyd[783]: Selected source 23.159.16.194 (2.centos.pool.ntp.org)
Jan 26 11:50:54 np0005596062 chronyd[783]: System clock TAI offset set to 37 seconds
Jan 26 11:50:55 np0005596062 dracut[1267]: *** Including module: prefixdevname ***
Jan 26 11:50:55 np0005596062 dracut[1267]: *** Including module: kernel-modules ***
Jan 26 11:50:55 np0005596062 kernel: block vda: the capability attribute has been deprecated.
Jan 26 11:50:55 np0005596062 dracut[1267]: *** Including module: kernel-modules-extra ***
Jan 26 11:50:55 np0005596062 dracut[1267]: *** Including module: qemu ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: fstab-sys ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: rootfs-block ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: terminfo ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: udev-rules ***
Jan 26 11:50:56 np0005596062 chronyd[783]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Jan 26 11:50:56 np0005596062 dracut[1267]: Skipping udev rule: 91-permissions.rules
Jan 26 11:50:56 np0005596062 dracut[1267]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: virtiofs ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: dracut-systemd ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: usrmount ***
Jan 26 11:50:56 np0005596062 dracut[1267]: *** Including module: base ***
Jan 26 11:50:57 np0005596062 dracut[1267]: *** Including module: fs-lib ***
Jan 26 11:50:57 np0005596062 dracut[1267]: *** Including module: kdumpbase ***
Jan 26 11:50:57 np0005596062 dracut[1267]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 26 11:50:57 np0005596062 dracut[1267]:  microcode_ctl module: mangling fw_dir
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 26 11:50:57 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 26 11:50:58 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 26 11:50:58 np0005596062 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 26 11:50:58 np0005596062 dracut[1267]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 26 11:50:58 np0005596062 dracut[1267]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 26 11:50:58 np0005596062 dracut[1267]: *** Including module: openssl ***
Jan 26 11:50:58 np0005596062 dracut[1267]: *** Including module: shutdown ***
Jan 26 11:50:58 np0005596062 dracut[1267]: *** Including module: squash ***
Jan 26 11:50:58 np0005596062 dracut[1267]: *** Including modules done ***
Jan 26 11:50:58 np0005596062 dracut[1267]: *** Installing kernel module dependencies ***
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 35 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 33 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 31 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 28 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 34 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 32 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 30 affinity is now unmanaged
Jan 26 11:50:58 np0005596062 irqbalance[777]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 26 11:50:58 np0005596062 irqbalance[777]: IRQ 29 affinity is now unmanaged
Jan 26 11:50:59 np0005596062 dracut[1267]: *** Installing kernel module dependencies done ***
Jan 26 11:50:59 np0005596062 dracut[1267]: *** Resolving executable dependencies ***
Jan 26 11:51:00 np0005596062 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 11:51:00 np0005596062 dracut[1267]: *** Resolving executable dependencies done ***
Jan 26 11:51:00 np0005596062 dracut[1267]: *** Generating early-microcode cpio image ***
Jan 26 11:51:00 np0005596062 dracut[1267]: *** Store current command line parameters ***
Jan 26 11:51:00 np0005596062 dracut[1267]: Stored kernel commandline:
Jan 26 11:51:00 np0005596062 dracut[1267]: No dracut internal kernel commandline stored in the initramfs
Jan 26 11:51:00 np0005596062 dracut[1267]: *** Install squash loader ***
Jan 26 11:51:01 np0005596062 dracut[1267]: *** Squashing the files inside the initramfs ***
Jan 26 11:51:03 np0005596062 dracut[1267]: *** Squashing the files inside the initramfs done ***
Jan 26 11:51:03 np0005596062 dracut[1267]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 26 11:51:03 np0005596062 dracut[1267]: *** Hardlinking files ***
Jan 26 11:51:03 np0005596062 dracut[1267]: *** Hardlinking files done ***
Jan 26 11:51:04 np0005596062 dracut[1267]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 26 11:51:05 np0005596062 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Jan 26 11:51:05 np0005596062 kdumpctl[1018]: kdump: Starting kdump: [OK]
Jan 26 11:51:05 np0005596062 systemd[1]: Finished Crash recovery kernel arming.
Jan 26 11:51:05 np0005596062 systemd[1]: Startup finished in 2.147s (kernel) + 2.902s (initrd) + 20.590s (userspace) = 25.640s.
Jan 26 11:51:19 np0005596062 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 12:00:14 np0005596062 systemd[1]: Created slice User Slice of UID 1000.
Jan 26 12:00:14 np0005596062 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 26 12:00:14 np0005596062 systemd-logind[781]: New session 1 of user zuul.
Jan 26 12:00:14 np0005596062 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 26 12:00:14 np0005596062 systemd[1]: Starting User Manager for UID 1000...
Jan 26 12:00:14 np0005596062 systemd[4313]: Queued start job for default target Main User Target.
Jan 26 12:00:14 np0005596062 systemd[4313]: Created slice User Application Slice.
Jan 26 12:00:14 np0005596062 systemd[4313]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 12:00:14 np0005596062 systemd[4313]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 12:00:14 np0005596062 systemd[4313]: Reached target Paths.
Jan 26 12:00:14 np0005596062 systemd[4313]: Reached target Timers.
Jan 26 12:00:14 np0005596062 systemd[4313]: Starting D-Bus User Message Bus Socket...
Jan 26 12:00:14 np0005596062 systemd[4313]: Starting Create User's Volatile Files and Directories...
Jan 26 12:00:14 np0005596062 systemd[4313]: Finished Create User's Volatile Files and Directories.
Jan 26 12:00:14 np0005596062 systemd[4313]: Listening on D-Bus User Message Bus Socket.
Jan 26 12:00:14 np0005596062 systemd[4313]: Reached target Sockets.
Jan 26 12:00:14 np0005596062 systemd[4313]: Reached target Basic System.
Jan 26 12:00:14 np0005596062 systemd[4313]: Reached target Main User Target.
Jan 26 12:00:14 np0005596062 systemd[4313]: Startup finished in 116ms.
Jan 26 12:00:14 np0005596062 systemd[1]: Started User Manager for UID 1000.
Jan 26 12:00:14 np0005596062 systemd[1]: Started Session 1 of User zuul.
Jan 26 12:00:14 np0005596062 python3[4396]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:00:19 np0005596062 python3[4424]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:00:26 np0005596062 python3[4484]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:00:27 np0005596062 python3[4524]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 26 12:00:29 np0005596062 python3[4550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzdr5ekFaPx5lHCHhmmyyey8qVCMeTV+mIzJTvZGts8fWfChGIy6y7YCGlmkytpqPA07Fdi16KsS1gXNTGiDaGesXpaNaE+VxEl1z2rMUDI2agXur5kwAnnLX6ecRHowjHbfU1zfjLXFqAMHYc0aCPRCp060fLIuO4nlwJ3GWq0ye5H1ZVELwGDayCuDWzbK5aHDztQdNJDgdy9OPuZ8b+K9F7fbWU1Z+dBU7m5IN5KjKFd/cPNSHsK6ON+/Sfi4qtk8jBXQYpM1BizgXu33re8tOhjys5ZQoV9DYya4bJkXiff+Ruz4U28Pu9uh4FkhbYSpG9Y1LTnlG2kGmI4atVVR7gSRZv/2LznHdwcFRHyX7kKVFwYvWMjYumEpe5bfQIF9XXoeFhFEMeEpl3jwZGQKFDFakCMaU4DYm0kDhjP3TXPwc1qih9KawhQ/+M5yhHRmTfFnaue4dl4qdaYLxvciw6hzU/3xhhgXvi22OXk3iReBOKJZxM0/S5k0VAG2c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:30 np0005596062 python3[4574]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:30 np0005596062 python3[4673]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:31 np0005596062 python3[4744]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769446830.412215-253-185302690531942/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=80829f0cc97a4a10a8c6e238c5ab9a25_id_rsa follow=False checksum=6709c318edb1fd99be951f08f8e495e1e3755a4f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:31 np0005596062 python3[4867]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:32 np0005596062 python3[4938]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769446831.3483884-308-93044011107997/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=80829f0cc97a4a10a8c6e238c5ab9a25_id_rsa.pub follow=False checksum=a347889044a62ef15060bc27e1ab0fab9aa4e666 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:33 np0005596062 python3[4986]: ansible-ping Invoked with data=pong
Jan 26 12:00:34 np0005596062 python3[5010]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:00:36 np0005596062 python3[5068]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 26 12:00:37 np0005596062 python3[5100]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:37 np0005596062 python3[5124]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:38 np0005596062 python3[5148]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:38 np0005596062 python3[5172]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:38 np0005596062 python3[5196]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:39 np0005596062 python3[5220]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:40 np0005596062 python3[5246]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:41 np0005596062 python3[5324]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:41 np0005596062 python3[5397]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769446840.987899-34-46905289328427/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:42 np0005596062 python3[5445]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:42 np0005596062 python3[5469]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:43 np0005596062 python3[5493]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:43 np0005596062 python3[5517]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:43 np0005596062 python3[5541]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:44 np0005596062 python3[5565]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:44 np0005596062 python3[5589]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:44 np0005596062 python3[5613]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:44 np0005596062 python3[5637]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:45 np0005596062 python3[5661]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:45 np0005596062 python3[5685]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:45 np0005596062 python3[5709]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:46 np0005596062 python3[5733]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:46 np0005596062 python3[5757]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:46 np0005596062 python3[5781]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:46 np0005596062 python3[5805]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:47 np0005596062 python3[5829]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:47 np0005596062 python3[5853]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:47 np0005596062 python3[5877]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:48 np0005596062 python3[5901]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:48 np0005596062 python3[5925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:48 np0005596062 python3[5949]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:48 np0005596062 python3[5973]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:49 np0005596062 python3[5997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:49 np0005596062 python3[6021]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:49 np0005596062 python3[6045]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:00:52 np0005596062 python3[6071]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 12:00:52 np0005596062 systemd[1]: Starting Time & Date Service...
Jan 26 12:00:52 np0005596062 systemd[1]: Started Time & Date Service.
Jan 26 12:00:52 np0005596062 systemd-timedated[6073]: Changed time zone to 'UTC' (UTC).
Jan 26 12:00:52 np0005596062 python3[6102]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:53 np0005596062 python3[6178]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:53 np0005596062 python3[6249]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769446852.829782-254-193317906507532/source _original_basename=tmpjw_fa53u follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:54 np0005596062 python3[6349]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:54 np0005596062 python3[6420]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769446854.3096225-304-205863441538206/source _original_basename=tmp4lne6d35 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:55 np0005596062 python3[6522]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:55 np0005596062 python3[6595]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769446855.3330593-385-115793100022611/source _original_basename=tmpmcjah1du follow=False checksum=0278a60fa9fcb701d8ddd2d2d748895769827669 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:56 np0005596062 python3[6643]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:00:56 np0005596062 python3[6669]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:00:57 np0005596062 python3[6749]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:00:57 np0005596062 python3[6822]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769446856.9610984-454-50732929226370/source _original_basename=tmp_avoy5sc follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:00:58 np0005596062 python3[6873]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-d7b6-8e1f-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:00:58 np0005596062 python3[6901]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d7b6-8e1f-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 26 12:01:00 np0005596062 python3[6929]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:01:22 np0005596062 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 12:01:23 np0005596062 python3[6972]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:02:23 np0005596062 systemd-logind[781]: Session 1 logged out. Waiting for processes to exit.
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 26 12:02:54 np0005596062 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 26 12:02:54 np0005596062 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3116] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 12:02:54 np0005596062 systemd-udevd[6975]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3257] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3286] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3289] device (eth1): carrier: link connected
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3291] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3296] policy: auto-activating connection 'Wired connection 1' (df5cc3c6-f91b-32fd-9da3-aee41bff0f12)
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3300] device (eth1): Activation: starting connection 'Wired connection 1' (df5cc3c6-f91b-32fd-9da3-aee41bff0f12)
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3300] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3302] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3306] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:02:54 np0005596062 NetworkManager[857]: <info>  [1769446974.3310] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:02:54 np0005596062 systemd[4313]: Starting Mark boot as successful...
Jan 26 12:02:54 np0005596062 systemd[4313]: Finished Mark boot as successful.
Jan 26 12:02:55 np0005596062 systemd-logind[781]: New session 3 of user zuul.
Jan 26 12:02:55 np0005596062 systemd[1]: Started Session 3 of User zuul.
Jan 26 12:02:55 np0005596062 python3[7007]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-4b93-3762-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:03:05 np0005596062 python3[7087]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:03:05 np0005596062 python3[7160]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769446984.9910395-206-85427497107599/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=30f1d1334a439ac986cc240aef51f00c96e4aa6a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:03:06 np0005596062 python3[7210]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:03:06 np0005596062 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 12:03:06 np0005596062 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 12:03:06 np0005596062 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 12:03:06 np0005596062 systemd[1]: Stopping Network Manager...
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3540] caught SIGTERM, shutting down normally.
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3552] dhcp4 (eth0): canceled DHCP transaction
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3552] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3552] dhcp4 (eth0): state changed no lease
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3554] manager: NetworkManager state is now CONNECTING
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3697] dhcp4 (eth1): canceled DHCP transaction
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3697] dhcp4 (eth1): state changed no lease
Jan 26 12:03:06 np0005596062 NetworkManager[857]: <info>  [1769446986.3760] exiting (success)
Jan 26 12:03:06 np0005596062 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 12:03:06 np0005596062 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 12:03:06 np0005596062 systemd[1]: Stopped Network Manager.
Jan 26 12:03:06 np0005596062 systemd[1]: NetworkManager.service: Consumed 4.906s CPU time, 10.0M memory peak.
Jan 26 12:03:06 np0005596062 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 12:03:06 np0005596062 systemd[1]: Starting Network Manager...
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.4388] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0500db80-16b3-49e6-bb63-ade1deb047ad)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.4389] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.4451] manager[0x55e846738000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 12:03:06 np0005596062 systemd[1]: Starting Hostname Service...
Jan 26 12:03:06 np0005596062 systemd[1]: Started Hostname Service.
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5217] hostname: hostname: using hostnamed
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5219] hostname: static hostname changed from (none) to "np0005596062.novalocal"
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5226] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5233] manager[0x55e846738000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5233] manager[0x55e846738000]: rfkill: WWAN hardware radio set enabled
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5266] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5266] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5267] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5267] manager: Networking is enabled by state file
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5270] settings: Loaded settings plugin: keyfile (internal)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5275] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5305] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5315] dhcp: init: Using DHCP client 'internal'
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5318] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5325] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5333] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5343] device (lo): Activation: starting connection 'lo' (f73eba9c-44a7-4e51-ab55-16d275cdfcc3)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5350] device (eth0): carrier: link connected
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5355] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5361] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5362] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5370] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5377] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5384] device (eth1): carrier: link connected
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5389] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5394] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (df5cc3c6-f91b-32fd-9da3-aee41bff0f12) (indicated)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5395] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5401] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5409] device (eth1): Activation: starting connection 'Wired connection 1' (df5cc3c6-f91b-32fd-9da3-aee41bff0f12)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5415] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 12:03:06 np0005596062 systemd[1]: Started Network Manager.
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5421] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5424] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5426] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5429] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5432] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5435] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5438] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5453] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5471] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5477] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5493] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5499] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5521] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5529] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5539] device (lo): Activation: successful, device activated.
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5550] dhcp4 (eth0): state changed new lease, address=38.102.83.190
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5562] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 12:03:06 np0005596062 systemd[1]: Starting Network Manager Wait Online...
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5660] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5701] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5703] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5706] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5708] device (eth0): Activation: successful, device activated.
Jan 26 12:03:06 np0005596062 NetworkManager[7218]: <info>  [1769446986.5712] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 12:03:06 np0005596062 python3[7295]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-4b93-3762-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:03:16 np0005596062 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 12:03:36 np0005596062 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7272] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 12:03:51 np0005596062 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 12:03:51 np0005596062 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7598] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7600] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7607] device (eth1): Activation: successful, device activated.
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7612] manager: startup complete
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7617] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <warn>  [1769447031.7621] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7628] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 systemd[1]: Finished Network Manager Wait Online.
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7737] dhcp4 (eth1): canceled DHCP transaction
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7738] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7739] dhcp4 (eth1): state changed no lease
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7755] policy: auto-activating connection 'ci-private-network' (bae2aad0-b0a0-5029-8b43-136c50f17dfb)
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7760] device (eth1): Activation: starting connection 'ci-private-network' (bae2aad0-b0a0-5029-8b43-136c50f17dfb)
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7762] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7765] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7775] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7787] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7834] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7837] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:03:51 np0005596062 NetworkManager[7218]: <info>  [1769447031.7844] device (eth1): Activation: successful, device activated.
Jan 26 12:04:01 np0005596062 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 12:04:06 np0005596062 systemd[1]: session-3.scope: Deactivated successfully.
Jan 26 12:04:06 np0005596062 systemd[1]: session-3.scope: Consumed 1.723s CPU time.
Jan 26 12:04:06 np0005596062 systemd-logind[781]: Session 3 logged out. Waiting for processes to exit.
Jan 26 12:04:06 np0005596062 systemd-logind[781]: Removed session 3.
Jan 26 12:04:21 np0005596062 systemd-logind[781]: New session 4 of user zuul.
Jan 26 12:04:21 np0005596062 systemd[1]: Started Session 4 of User zuul.
Jan 26 12:04:21 np0005596062 python3[7405]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:04:22 np0005596062 python3[7478]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769447061.621101-373-104014120234467/source _original_basename=tmp5no2db0s follow=False checksum=ff12fd3b26ba1169babfae82900b18cee99f46f8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:04:24 np0005596062 systemd[1]: session-4.scope: Deactivated successfully.
Jan 26 12:04:24 np0005596062 systemd-logind[781]: Session 4 logged out. Waiting for processes to exit.
Jan 26 12:04:24 np0005596062 systemd-logind[781]: Removed session 4.
Jan 26 12:06:01 np0005596062 systemd[4313]: Created slice User Background Tasks Slice.
Jan 26 12:06:01 np0005596062 systemd[4313]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 12:06:01 np0005596062 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 26 12:06:01 np0005596062 systemd[4313]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 12:06:01 np0005596062 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 26 12:06:01 np0005596062 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 26 12:06:01 np0005596062 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 26 12:10:24 np0005596062 systemd-logind[781]: New session 5 of user zuul.
Jan 26 12:10:24 np0005596062 systemd[1]: Started Session 5 of User zuul.
Jan 26 12:10:24 np0005596062 python3[7542]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-1288-0b10-000000000caa-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:10:25 np0005596062 python3[7571]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:10:25 np0005596062 python3[7597]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:10:25 np0005596062 python3[7623]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:10:26 np0005596062 python3[7649]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:10:26 np0005596062 python3[7675]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:10:26 np0005596062 python3[7753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:10:27 np0005596062 python3[7826]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769447426.68194-370-241756229540957/source _original_basename=tmpy4sdn3_n follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:10:28 np0005596062 python3[7876]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 12:10:28 np0005596062 systemd[1]: Reloading.
Jan 26 12:10:28 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:10:30 np0005596062 python3[7932]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 26 12:10:31 np0005596062 python3[7958]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:10:31 np0005596062 python3[7986]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:10:31 np0005596062 python3[8014]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:10:31 np0005596062 python3[8042]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:10:32 np0005596062 python3[8069]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-1288-0b10-000000000cb1-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:10:33 np0005596062 python3[8099]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 12:10:36 np0005596062 systemd[1]: session-5.scope: Deactivated successfully.
Jan 26 12:10:36 np0005596062 systemd[1]: session-5.scope: Consumed 4.338s CPU time.
Jan 26 12:10:36 np0005596062 systemd-logind[781]: Session 5 logged out. Waiting for processes to exit.
Jan 26 12:10:36 np0005596062 systemd-logind[781]: Removed session 5.
Jan 26 12:10:38 np0005596062 systemd-logind[781]: New session 6 of user zuul.
Jan 26 12:10:38 np0005596062 systemd[1]: Started Session 6 of User zuul.
Jan 26 12:10:38 np0005596062 python3[8132]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 12:10:44 np0005596062 setsebool[8174]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 26 12:10:44 np0005596062 setsebool[8174]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 26 12:10:55 np0005596062 kernel: SELinux:  Converting 386 SID table entries...
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:10:55 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:11:05 np0005596062 kernel: SELinux:  Converting 389 SID table entries...
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:11:05 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:11:23 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 12:11:24 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:11:24 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:11:24 np0005596062 systemd[1]: Reloading.
Jan 26 12:11:24 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:11:24 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:11:48 np0005596062 python3[20573]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-084f-674e-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:11:48 np0005596062 kernel: evm: overlay not supported
Jan 26 12:11:49 np0005596062 systemd[4313]: Starting D-Bus User Message Bus...
Jan 26 12:11:49 np0005596062 dbus-broker-launch[21023]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 26 12:11:49 np0005596062 dbus-broker-launch[21023]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 26 12:11:49 np0005596062 systemd[4313]: Started D-Bus User Message Bus.
Jan 26 12:11:49 np0005596062 dbus-broker-lau[21023]: Ready
Jan 26 12:11:49 np0005596062 systemd[4313]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 12:11:49 np0005596062 systemd[4313]: Created slice Slice /user.
Jan 26 12:11:49 np0005596062 systemd[4313]: podman-20955.scope: unit configures an IP firewall, but not running as root.
Jan 26 12:11:49 np0005596062 systemd[4313]: (This warning is only shown for the first unit using IP firewalling.)
Jan 26 12:11:49 np0005596062 systemd[4313]: Started podman-20955.scope.
Jan 26 12:11:49 np0005596062 systemd[4313]: Started podman-pause-0b6b356d.scope.
Jan 26 12:11:51 np0005596062 python3[21930]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.22:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.22:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:11:51 np0005596062 python3[21930]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 26 12:11:51 np0005596062 systemd[1]: session-6.scope: Deactivated successfully.
Jan 26 12:11:51 np0005596062 systemd[1]: session-6.scope: Consumed 42.917s CPU time.
Jan 26 12:11:51 np0005596062 systemd-logind[781]: Session 6 logged out. Waiting for processes to exit.
Jan 26 12:11:51 np0005596062 systemd-logind[781]: Removed session 6.
Jan 26 12:12:13 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:12:13 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:12:13 np0005596062 systemd[1]: man-db-cache-update.service: Consumed 59.174s CPU time.
Jan 26 12:12:13 np0005596062 systemd[1]: run-rcbcb005329dc47e3a7be1f8623a401a6.service: Deactivated successfully.
Jan 26 12:12:21 np0005596062 systemd-logind[781]: New session 7 of user zuul.
Jan 26 12:12:21 np0005596062 systemd[1]: Started Session 7 of User zuul.
Jan 26 12:12:22 np0005596062 python3[29654]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC2KNOrliIHVpRkoYSANCIWkSJEaoIB3ID7izEiG92Sz9ZDLnB8Yf+FcZuIYW5FpyTRAiW5K324Zpl7LaJD12Jw= zuul@np0005596059.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:12:22 np0005596062 python3[29680]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC2KNOrliIHVpRkoYSANCIWkSJEaoIB3ID7izEiG92Sz9ZDLnB8Yf+FcZuIYW5FpyTRAiW5K324Zpl7LaJD12Jw= zuul@np0005596059.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:12:23 np0005596062 python3[29706]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005596062.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 26 12:12:24 np0005596062 python3[29740]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC2KNOrliIHVpRkoYSANCIWkSJEaoIB3ID7izEiG92Sz9ZDLnB8Yf+FcZuIYW5FpyTRAiW5K324Zpl7LaJD12Jw= zuul@np0005596059.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 12:12:24 np0005596062 python3[29818]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:12:25 np0005596062 python3[29891]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769447544.586751-170-205562127193169/source _original_basename=tmptg7c7npv follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:12:26 np0005596062 python3[29941]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 26 12:12:26 np0005596062 systemd[1]: Starting Hostname Service...
Jan 26 12:12:26 np0005596062 systemd[1]: Started Hostname Service.
Jan 26 12:12:26 np0005596062 systemd-hostnamed[29945]: Changed pretty hostname to 'compute-2'
Jan 26 12:12:26 np0005596062 systemd-hostnamed[29945]: Hostname set to <compute-2> (static)
Jan 26 12:12:26 np0005596062 NetworkManager[7218]: <info>  [1769447546.4356] hostname: static hostname changed from "np0005596062.novalocal" to "compute-2"
Jan 26 12:12:26 np0005596062 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 12:12:26 np0005596062 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 12:12:26 np0005596062 systemd[1]: session-7.scope: Deactivated successfully.
Jan 26 12:12:26 np0005596062 systemd[1]: session-7.scope: Consumed 2.641s CPU time.
Jan 26 12:12:26 np0005596062 systemd-logind[781]: Session 7 logged out. Waiting for processes to exit.
Jan 26 12:12:26 np0005596062 systemd-logind[781]: Removed session 7.
Jan 26 12:12:36 np0005596062 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 12:12:56 np0005596062 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 12:16:53 np0005596062 systemd-logind[781]: New session 8 of user zuul.
Jan 26 12:16:53 np0005596062 systemd[1]: Started Session 8 of User zuul.
Jan 26 12:16:53 np0005596062 python3[30050]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:16:55 np0005596062 python3[30166]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:16:56 np0005596062 python3[30239]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:16:56 np0005596062 python3[30265]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:16:56 np0005596062 python3[30338]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:16:57 np0005596062 python3[30364]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:16:57 np0005596062 python3[30437]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:16:57 np0005596062 python3[30463]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:16:58 np0005596062 python3[30536]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:16:58 np0005596062 python3[30562]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:16:58 np0005596062 python3[30635]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:16:59 np0005596062 python3[30661]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:16:59 np0005596062 python3[30734]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:16:59 np0005596062 python3[30760]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:17:00 np0005596062 python3[30833]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769447815.4353328-34069-173287726876155/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:17:12 np0005596062 python3[30881]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:22:12 np0005596062 systemd[1]: session-8.scope: Deactivated successfully.
Jan 26 12:22:12 np0005596062 systemd-logind[781]: Session 8 logged out. Waiting for processes to exit.
Jan 26 12:22:12 np0005596062 systemd[1]: session-8.scope: Consumed 5.499s CPU time.
Jan 26 12:22:12 np0005596062 systemd-logind[781]: Removed session 8.
Jan 26 12:27:01 np0005596062 systemd[1]: Starting dnf makecache...
Jan 26 12:27:01 np0005596062 dnf[30893]: Failed determining last makecache time.
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-barbican-42b4c41831408a8e323 289 kB/s |  13 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.6 MB/s |  65 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-python-stevedore-c4acc5639fd2329372142 4.5 MB/s | 131 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.1 MB/s |  32 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-os-refresh-config-9bfc52b5049be2d8de61  10 MB/s | 349 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.7 MB/s |  42 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-python-designate-tests-tempest-347fdbc 793 kB/s |  18 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-glance-1fd12c29b339f30fe823e 753 kB/s |  18 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.4 MB/s |  29 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-manila-3c01b7181572c95dac462 1.2 MB/s |  25 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-python-whitebox-neutron-tests-tempest- 6.1 MB/s | 154 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-octavia-ba397f07a7331190208c 1.0 MB/s |  26 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-watcher-c014f81a8647287f6dcc 690 kB/s |  16 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-ansible-config_template-5ccaa22121a7ff 300 kB/s | 7.4 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.5 MB/s | 144 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-swift-dc98a8463506ac520c469a 602 kB/s |  14 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-python-tempestconf-8515371b7cceebd4282 2.4 MB/s |  53 kB     00:00
Jan 26 12:27:02 np0005596062 dnf[30893]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.6 MB/s |  96 kB     00:00
Jan 26 12:27:03 np0005596062 dnf[30893]: CentOS Stream 9 - BaseOS                         66 kB/s | 6.7 kB     00:00
Jan 26 12:27:03 np0005596062 dnf[30893]: CentOS Stream 9 - AppStream                      29 kB/s | 6.8 kB     00:00
Jan 26 12:27:03 np0005596062 dnf[30893]: CentOS Stream 9 - CRB                            61 kB/s | 6.6 kB     00:00
Jan 26 12:27:03 np0005596062 dnf[30893]: CentOS Stream 9 - Extras packages                73 kB/s | 7.3 kB     00:00
Jan 26 12:27:03 np0005596062 dnf[30893]: dlrn-antelope-testing                            28 MB/s | 1.1 MB     00:00
Jan 26 12:27:04 np0005596062 dnf[30893]: dlrn-antelope-build-deps                         17 MB/s | 461 kB     00:00
Jan 26 12:27:04 np0005596062 dnf[30893]: centos9-rabbitmq                                8.0 MB/s | 123 kB     00:00
Jan 26 12:27:04 np0005596062 dnf[30893]: centos9-storage                                  19 MB/s | 415 kB     00:00
Jan 26 12:27:04 np0005596062 dnf[30893]: centos9-opstools                                4.4 MB/s |  51 kB     00:00
Jan 26 12:27:04 np0005596062 dnf[30893]: NFV SIG OpenvSwitch                              22 MB/s | 461 kB     00:00
Jan 26 12:27:05 np0005596062 dnf[30893]: repo-setup-centos-appstream                      86 MB/s |  26 MB     00:00
Jan 26 12:27:11 np0005596062 dnf[30893]: repo-setup-centos-baseos                         74 MB/s | 8.9 MB     00:00
Jan 26 12:27:13 np0005596062 dnf[30893]: repo-setup-centos-highavailability               28 MB/s | 744 kB     00:00
Jan 26 12:27:13 np0005596062 dnf[30893]: repo-setup-centos-powertools                     74 MB/s | 7.6 MB     00:00
Jan 26 12:27:16 np0005596062 dnf[30893]: Extra Packages for Enterprise Linux 9 - x86_64   23 MB/s |  20 MB     00:00
Jan 26 12:27:31 np0005596062 dnf[30893]: Metadata cache created.
Jan 26 12:27:31 np0005596062 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 26 12:27:31 np0005596062 systemd[1]: Finished dnf makecache.
Jan 26 12:27:31 np0005596062 systemd[1]: dnf-makecache.service: Consumed 27.572s CPU time.
Jan 26 12:29:55 np0005596062 systemd-logind[781]: New session 9 of user zuul.
Jan 26 12:29:55 np0005596062 systemd[1]: Started Session 9 of User zuul.
Jan 26 12:29:56 np0005596062 python3.9[31151]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:29:57 np0005596062 python3.9[31332]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:30:07 np0005596062 systemd[1]: session-9.scope: Deactivated successfully.
Jan 26 12:30:07 np0005596062 systemd[1]: session-9.scope: Consumed 7.949s CPU time.
Jan 26 12:30:07 np0005596062 systemd-logind[781]: Session 9 logged out. Waiting for processes to exit.
Jan 26 12:30:07 np0005596062 systemd-logind[781]: Removed session 9.
Jan 26 12:30:22 np0005596062 systemd-logind[781]: New session 10 of user zuul.
Jan 26 12:30:22 np0005596062 systemd[1]: Started Session 10 of User zuul.
Jan 26 12:30:23 np0005596062 python3.9[31542]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 12:30:24 np0005596062 python3.9[31716]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:30:25 np0005596062 python3.9[31868]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:30:26 np0005596062 python3.9[32021]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:30:27 np0005596062 python3.9[32173]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:30:27 np0005596062 python3.9[32325]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:30:28 np0005596062 python3.9[32448]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448627.4500794-180-145139743267585/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:30:29 np0005596062 python3.9[32600]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:30:30 np0005596062 python3.9[32756]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:30:31 np0005596062 python3.9[32908]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:30:32 np0005596062 python3.9[33058]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:30:35 np0005596062 python3.9[33311]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:30:36 np0005596062 python3.9[33461]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:30:37 np0005596062 python3.9[33615]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:30:39 np0005596062 python3.9[33773]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:30:39 np0005596062 python3.9[33857]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:31:27 np0005596062 systemd[1]: Reloading.
Jan 26 12:31:27 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:31:27 np0005596062 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 26 12:31:27 np0005596062 systemd[1]: Reloading.
Jan 26 12:31:27 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:31:28 np0005596062 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 26 12:31:28 np0005596062 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 26 12:31:28 np0005596062 systemd[1]: Reloading.
Jan 26 12:31:28 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:31:28 np0005596062 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 26 12:31:28 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:31:28 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:31:28 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:32:33 np0005596062 kernel: SELinux:  Converting 2725 SID table entries...
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:32:33 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:32:33 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 26 12:32:33 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:32:33 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:32:33 np0005596062 systemd[1]: Reloading.
Jan 26 12:32:33 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:32:33 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:32:34 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:32:34 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:32:34 np0005596062 systemd[1]: man-db-cache-update.service: Consumed 1.154s CPU time.
Jan 26 12:32:34 np0005596062 systemd[1]: run-rb9e36df3c541413e966257ae634626c6.service: Deactivated successfully.
Jan 26 12:32:35 np0005596062 python3.9[35372]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:32:37 np0005596062 python3.9[35653]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 12:32:38 np0005596062 python3.9[35805]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 12:32:41 np0005596062 python3.9[35960]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:32:44 np0005596062 python3.9[36112]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 12:32:49 np0005596062 python3.9[36264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:32:53 np0005596062 python3.9[36416]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:32:54 np0005596062 python3.9[36539]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769448770.443832-669-21578677991632/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:32:55 np0005596062 python3.9[36691]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:32:56 np0005596062 python3.9[36843]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:32:57 np0005596062 python3.9[36996]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:32:58 np0005596062 python3.9[37148]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 12:32:58 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 12:32:59 np0005596062 python3.9[37302]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 12:33:00 np0005596062 python3.9[37460]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 12:33:01 np0005596062 python3.9[37620]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 12:33:02 np0005596062 python3.9[37773]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 12:33:03 np0005596062 python3.9[37931]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 12:33:04 np0005596062 python3.9[38083]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:33:07 np0005596062 python3.9[38237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:33:08 np0005596062 python3.9[38389]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:33:09 np0005596062 python3.9[38512]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769448787.899761-1026-185440905867853/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:33:10 np0005596062 python3.9[38664]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:33:10 np0005596062 systemd[1]: Starting Load Kernel Modules...
Jan 26 12:33:10 np0005596062 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 26 12:33:10 np0005596062 kernel: Bridge firewalling registered
Jan 26 12:33:10 np0005596062 systemd-modules-load[38668]: Inserted module 'br_netfilter'
Jan 26 12:33:10 np0005596062 systemd[1]: Finished Load Kernel Modules.
Jan 26 12:33:11 np0005596062 python3.9[38824]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:33:11 np0005596062 python3.9[38947]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769448790.6331687-1095-80446182849379/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:33:12 np0005596062 python3.9[39099]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:33:15 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:33:15 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:33:16 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:33:16 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:33:16 np0005596062 systemd[1]: Reloading.
Jan 26 12:33:16 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:33:16 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:33:18 np0005596062 python3.9[41262]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:33:19 np0005596062 python3.9[42588]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 12:33:20 np0005596062 python3.9[43113]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:33:20 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:33:20 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:33:20 np0005596062 systemd[1]: man-db-cache-update.service: Consumed 5.322s CPU time.
Jan 26 12:33:20 np0005596062 systemd[1]: run-r0f5bad234ddb4db588a92f0c52805212.service: Deactivated successfully.
Jan 26 12:33:21 np0005596062 python3.9[43266]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:33:21 np0005596062 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 12:33:22 np0005596062 systemd[1]: Starting Authorization Manager...
Jan 26 12:33:22 np0005596062 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 12:33:22 np0005596062 polkitd[43483]: Started polkitd version 0.117
Jan 26 12:33:22 np0005596062 systemd[1]: Started Authorization Manager.
Jan 26 12:33:23 np0005596062 python3.9[43653]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:33:23 np0005596062 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 12:33:23 np0005596062 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 12:33:23 np0005596062 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 12:33:23 np0005596062 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 12:33:23 np0005596062 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 12:33:24 np0005596062 python3.9[43814]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 12:33:28 np0005596062 python3.9[43966]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:33:29 np0005596062 systemd[1]: Reloading.
Jan 26 12:33:29 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:33:30 np0005596062 python3.9[44155]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:33:30 np0005596062 systemd[1]: Reloading.
Jan 26 12:33:30 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:33:31 np0005596062 python3.9[44344]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:33:32 np0005596062 python3.9[44497]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:33:32 np0005596062 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 26 12:33:33 np0005596062 python3.9[44650]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:33:35 np0005596062 python3.9[44812]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:33:36 np0005596062 python3.9[44965]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:33:36 np0005596062 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 12:33:36 np0005596062 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 12:33:36 np0005596062 systemd[1]: Stopping Apply Kernel Variables...
Jan 26 12:33:36 np0005596062 systemd[1]: Starting Apply Kernel Variables...
Jan 26 12:33:36 np0005596062 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 12:33:36 np0005596062 systemd[1]: Finished Apply Kernel Variables.
Jan 26 12:33:37 np0005596062 systemd[1]: session-10.scope: Deactivated successfully.
Jan 26 12:33:37 np0005596062 systemd[1]: session-10.scope: Consumed 2min 19.670s CPU time.
Jan 26 12:33:37 np0005596062 systemd-logind[781]: Session 10 logged out. Waiting for processes to exit.
Jan 26 12:33:37 np0005596062 systemd-logind[781]: Removed session 10.
Jan 26 12:33:43 np0005596062 systemd-logind[781]: New session 11 of user zuul.
Jan 26 12:33:44 np0005596062 systemd[1]: Started Session 11 of User zuul.
Jan 26 12:33:45 np0005596062 python3.9[45148]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:33:46 np0005596062 python3.9[45304]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 12:33:47 np0005596062 python3.9[45457]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 12:33:48 np0005596062 python3.9[45615]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 12:33:49 np0005596062 python3.9[45775]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:33:50 np0005596062 python3.9[45859]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 12:33:54 np0005596062 python3.9[46022]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:34:04 np0005596062 kernel: SELinux:  Converting 2737 SID table entries...
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:34:04 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:34:05 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 26 12:34:05 np0005596062 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 26 12:34:06 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:34:06 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:34:06 np0005596062 systemd[1]: Reloading.
Jan 26 12:34:06 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:34:06 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:34:07 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:34:07 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:34:07 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:34:07 np0005596062 systemd[1]: run-r54a1cadbf45d48878e722f495945e61d.service: Deactivated successfully.
Jan 26 12:34:12 np0005596062 python3.9[47119]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:34:12 np0005596062 systemd[1]: Reloading.
Jan 26 12:34:13 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:34:13 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:34:13 np0005596062 systemd[1]: Starting Open vSwitch Database Unit...
Jan 26 12:34:13 np0005596062 chown[47161]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 26 12:34:13 np0005596062 ovs-ctl[47166]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 26 12:34:13 np0005596062 ovs-ctl[47166]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 26 12:34:13 np0005596062 ovs-ctl[47166]: Starting ovsdb-server [  OK  ]
Jan 26 12:34:13 np0005596062 ovs-vsctl[47216]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 26 12:34:13 np0005596062 ovs-vsctl[47236]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"9838f21e-c1ce-4cfa-829e-a12b9d657d8a\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 26 12:34:13 np0005596062 ovs-ctl[47166]: Configuring Open vSwitch system IDs [  OK  ]
Jan 26 12:34:13 np0005596062 ovs-vsctl[47242]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 26 12:34:13 np0005596062 ovs-ctl[47166]: Enabling remote OVSDB managers [  OK  ]
Jan 26 12:34:13 np0005596062 systemd[1]: Started Open vSwitch Database Unit.
Jan 26 12:34:13 np0005596062 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 26 12:34:13 np0005596062 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 26 12:34:13 np0005596062 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 26 12:34:13 np0005596062 kernel: openvswitch: Open vSwitch switching datapath
Jan 26 12:34:13 np0005596062 ovs-ctl[47286]: Inserting openvswitch module [  OK  ]
Jan 26 12:34:13 np0005596062 ovs-ctl[47255]: Starting ovs-vswitchd [  OK  ]
Jan 26 12:34:13 np0005596062 ovs-ctl[47255]: Enabling remote OVSDB managers [  OK  ]
Jan 26 12:34:13 np0005596062 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 26 12:34:13 np0005596062 ovs-vsctl[47304]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 26 12:34:13 np0005596062 systemd[1]: Starting Open vSwitch...
Jan 26 12:34:13 np0005596062 systemd[1]: Finished Open vSwitch.
Jan 26 12:34:14 np0005596062 python3.9[47455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:34:15 np0005596062 python3.9[47607]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 12:34:17 np0005596062 kernel: SELinux:  Converting 2751 SID table entries...
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:34:17 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:34:18 np0005596062 python3.9[47762]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:34:19 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 26 12:34:19 np0005596062 python3.9[47920]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:34:22 np0005596062 python3.9[48073]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:34:23 np0005596062 python3.9[48360]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 12:34:24 np0005596062 python3.9[48510]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:34:25 np0005596062 python3.9[48664]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:34:27 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:34:27 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:34:27 np0005596062 systemd[1]: Reloading.
Jan 26 12:34:27 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:34:27 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:34:27 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:34:27 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:34:27 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:34:27 np0005596062 systemd[1]: run-r11baf8704d89432d8510755beaca8a15.service: Deactivated successfully.
Jan 26 12:34:29 np0005596062 python3.9[48983]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:34:30 np0005596062 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 12:34:30 np0005596062 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 12:34:30 np0005596062 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 12:34:30 np0005596062 systemd[1]: Stopping Network Manager...
Jan 26 12:34:30 np0005596062 NetworkManager[7218]: <info>  [1769448870.4718] caught SIGTERM, shutting down normally.
Jan 26 12:34:30 np0005596062 NetworkManager[7218]: <info>  [1769448870.4754] dhcp4 (eth0): canceled DHCP transaction
Jan 26 12:34:30 np0005596062 NetworkManager[7218]: <info>  [1769448870.4755] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:34:30 np0005596062 NetworkManager[7218]: <info>  [1769448870.4755] dhcp4 (eth0): state changed no lease
Jan 26 12:34:30 np0005596062 NetworkManager[7218]: <info>  [1769448870.4765] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 12:34:30 np0005596062 NetworkManager[7218]: <info>  [1769448870.4860] exiting (success)
Jan 26 12:34:30 np0005596062 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 12:34:30 np0005596062 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 12:34:30 np0005596062 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 12:34:30 np0005596062 systemd[1]: Stopped Network Manager.
Jan 26 12:34:30 np0005596062 systemd[1]: NetworkManager.service: Consumed 12.723s CPU time, 4.1M memory peak, read 0B from disk, written 25.5K to disk.
Jan 26 12:34:30 np0005596062 systemd[1]: Starting Network Manager...
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.5442] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0500db80-16b3-49e6-bb63-ade1deb047ad)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.5445] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.5495] manager[0x55fcc7ba2000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 12:34:30 np0005596062 systemd[1]: Starting Hostname Service...
Jan 26 12:34:30 np0005596062 systemd[1]: Started Hostname Service.
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6481] hostname: hostname: using hostnamed
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6482] hostname: static hostname changed from (none) to "compute-2"
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6491] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6497] manager[0x55fcc7ba2000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6498] manager[0x55fcc7ba2000]: rfkill: WWAN hardware radio set enabled
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6536] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6553] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6555] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6556] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6558] manager: Networking is enabled by state file
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6561] settings: Loaded settings plugin: keyfile (internal)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6568] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6620] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6640] dhcp: init: Using DHCP client 'internal'
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6645] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6654] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6662] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6677] device (lo): Activation: starting connection 'lo' (f73eba9c-44a7-4e51-ab55-16d275cdfcc3)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6688] device (eth0): carrier: link connected
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6695] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6703] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6704] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6714] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6726] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6736] device (eth1): carrier: link connected
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6744] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6752] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (bae2aad0-b0a0-5029-8b43-136c50f17dfb) (indicated)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6753] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6764] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6779] device (eth1): Activation: starting connection 'ci-private-network' (bae2aad0-b0a0-5029-8b43-136c50f17dfb)
Jan 26 12:34:30 np0005596062 systemd[1]: Started Network Manager.
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6788] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6802] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6806] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6809] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6813] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6816] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6820] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6823] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6829] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6839] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6845] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6858] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6882] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6895] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6899] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6912] device (lo): Activation: successful, device activated.
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6925] dhcp4 (eth0): state changed new lease, address=38.102.83.190
Jan 26 12:34:30 np0005596062 systemd[1]: Starting Network Manager Wait Online...
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.6940] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7029] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7037] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7047] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7053] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7058] device (eth1): Activation: successful, device activated.
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7072] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7074] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7081] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7084] device (eth0): Activation: successful, device activated.
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7092] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 12:34:30 np0005596062 NetworkManager[48993]: <info>  [1769448870.7098] manager: startup complete
Jan 26 12:34:30 np0005596062 systemd[1]: Finished Network Manager Wait Online.
Jan 26 12:34:31 np0005596062 python3.9[49209]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:34:37 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:34:37 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:34:37 np0005596062 systemd[1]: Reloading.
Jan 26 12:34:37 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:34:37 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:34:37 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:34:38 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:34:38 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:34:38 np0005596062 systemd[1]: run-r6823c8b59dea4b1f9e4844758d744769.service: Deactivated successfully.
Jan 26 12:34:40 np0005596062 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 12:34:43 np0005596062 python3.9[49670]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:34:44 np0005596062 python3.9[49822]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:45 np0005596062 python3.9[49976]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:46 np0005596062 python3.9[50128]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:46 np0005596062 python3.9[50280]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:47 np0005596062 python3.9[50432]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:48 np0005596062 python3.9[50584]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:34:49 np0005596062 python3.9[50707]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448887.943376-649-193291639115042/.source _original_basename=.3dquja5j follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:50 np0005596062 python3.9[50859]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:50 np0005596062 python3.9[51011]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 26 12:34:51 np0005596062 python3.9[51163]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:34:53 np0005596062 python3.9[51590]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 26 12:34:55 np0005596062 ansible-async_wrapper.py[51765]: Invoked with j731008907291 300 /home/zuul/.ansible/tmp/ansible-tmp-1769448894.245567-847-181850480250213/AnsiballZ_edpm_os_net_config.py _
Jan 26 12:34:55 np0005596062 ansible-async_wrapper.py[51768]: Starting module and watcher
Jan 26 12:34:55 np0005596062 ansible-async_wrapper.py[51768]: Start watching 51769 (300)
Jan 26 12:34:55 np0005596062 ansible-async_wrapper.py[51769]: Start module (51769)
Jan 26 12:34:55 np0005596062 ansible-async_wrapper.py[51765]: Return async_wrapper task started.
Jan 26 12:34:55 np0005596062 python3.9[51770]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 26 12:34:56 np0005596062 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 26 12:34:56 np0005596062 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 26 12:34:56 np0005596062 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 26 12:34:56 np0005596062 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 26 12:34:56 np0005596062 kernel: cfg80211: failed to load regulatory.db
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3042] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3061] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3541] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3542] audit: op="connection-add" uuid="e00fa3ab-3632-46dc-add2-0e0094ab2e88" name="br-ex-br" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3559] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3560] audit: op="connection-add" uuid="a50ce932-b350-435a-8555-2031ac43b37d" name="br-ex-port" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3575] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3576] audit: op="connection-add" uuid="c0badcd1-cdc2-4a92-84cd-8699d60fef37" name="eth1-port" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3591] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3593] audit: op="connection-add" uuid="53e914d5-95f8-4d3f-9c1b-28c8cfe32991" name="vlan20-port" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3606] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3607] audit: op="connection-add" uuid="cc1dfd79-0564-4020-9bd9-5ba6a6afa862" name="vlan21-port" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3622] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3623] audit: op="connection-add" uuid="9e605ba4-3e53-4198-9de3-cc84d51d3b5e" name="vlan22-port" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3636] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3638] audit: op="connection-add" uuid="f656b8eb-db2e-4fb7-a880-2305f266f0ae" name="vlan23-port" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3661] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3679] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3680] audit: op="connection-add" uuid="317157bb-b98b-4d8f-a4ed-d7ab87bdb8a4" name="br-ex-if" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3727] audit: op="connection-update" uuid="bae2aad0-b0a0-5029-8b43-136c50f17dfb" name="ci-private-network" args="ipv4.addresses,ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.method,ipv4.never-default,ovs-external-ids.data,connection.master,connection.port-type,connection.controller,connection.slave-type,connection.timestamp,ovs-interface.type,ipv6.addresses,ipv6.dns,ipv6.addr-gen-mode,ipv6.routes,ipv6.routing-rules,ipv6.method" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3749] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3750] audit: op="connection-add" uuid="add94f74-71eb-4aa6-a4ce-6ae849c8a6a2" name="vlan20-if" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3771] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3772] audit: op="connection-add" uuid="78dddd8a-468c-499c-95cf-f25630cd3bfc" name="vlan21-if" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3793] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3796] audit: op="connection-add" uuid="3d678b45-3052-444d-85ce-ccf5a072c031" name="vlan22-if" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3818] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3820] audit: op="connection-add" uuid="729aa61b-0982-45da-ba36-7f27b17cd5df" name="vlan23-if" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3834] audit: op="connection-delete" uuid="df5cc3c6-f91b-32fd-9da3-aee41bff0f12" name="Wired connection 1" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3852] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3854] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3864] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3868] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e00fa3ab-3632-46dc-add2-0e0094ab2e88)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3869] audit: op="connection-activate" uuid="e00fa3ab-3632-46dc-add2-0e0094ab2e88" name="br-ex-br" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3870] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3871] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3876] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3881] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (a50ce932-b350-435a-8555-2031ac43b37d)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3882] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3883] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3888] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3892] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (c0badcd1-cdc2-4a92-84cd-8699d60fef37)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3894] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3895] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3900] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3904] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (53e914d5-95f8-4d3f-9c1b-28c8cfe32991)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3906] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3907] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3912] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3916] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (cc1dfd79-0564-4020-9bd9-5ba6a6afa862)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3918] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3919] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3924] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3928] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (9e605ba4-3e53-4198-9de3-cc84d51d3b5e)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3930] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3931] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3936] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3941] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f656b8eb-db2e-4fb7-a880-2305f266f0ae)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3941] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3944] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3946] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3952] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3953] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3956] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3961] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (317157bb-b98b-4d8f-a4ed-d7ab87bdb8a4)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3961] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3964] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3966] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3967] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3969] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3980] device (eth1): disconnecting for new activation request.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3981] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3983] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3985] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3987] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3989] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.3990] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3993] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3997] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (add94f74-71eb-4aa6-a4ce-6ae849c8a6a2)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.3998] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4001] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4002] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4003] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4006] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.4007] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4010] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4014] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (78dddd8a-468c-499c-95cf-f25630cd3bfc)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4015] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4018] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4019] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4021] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4023] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.4024] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4027] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4032] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (3d678b45-3052-444d-85ce-ccf5a072c031)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4032] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4035] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4037] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4038] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4041] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <warn>  [1769448897.4042] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4045] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4049] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (729aa61b-0982-45da-ba36-7f27b17cd5df)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4050] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4053] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4054] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4056] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4057] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4072] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4074] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4077] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4079] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4086] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4089] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4092] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4095] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4097] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4102] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4107] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4111] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4112] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 kernel: ovs-system: entered promiscuous mode
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4119] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4123] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4126] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 systemd-udevd[51776]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 12:34:57 np0005596062 kernel: Timeout policy base is empty
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4128] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4134] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4139] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4142] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4143] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4149] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4154] dhcp4 (eth0): canceled DHCP transaction
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4154] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4154] dhcp4 (eth0): state changed no lease
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4156] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4169] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4172] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51771 uid=0 result="fail" reason="Device is not activated"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4214] device (eth1): disconnecting for new activation request.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4215] audit: op="connection-activate" uuid="bae2aad0-b0a0-5029-8b43-136c50f17dfb" name="ci-private-network" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4222] dhcp4 (eth0): state changed new lease, address=38.102.83.190
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4261] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4268] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51771 uid=0 result="success"
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4285] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4295] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4302] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4325] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4444] device (eth1): Activation: starting connection 'ci-private-network' (bae2aad0-b0a0-5029-8b43-136c50f17dfb)
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4448] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4462] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4466] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4472] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4474] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4478] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4479] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4480] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4482] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4483] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4483] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 kernel: br-ex: entered promiscuous mode
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4498] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4508] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4511] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4515] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4518] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4521] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4524] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4527] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4530] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4533] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4536] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4539] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4543] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4556] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4561] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4596] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4598] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4604] device (eth1): Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 kernel: vlan22: entered promiscuous mode
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4632] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 26 12:34:57 np0005596062 systemd-udevd[51775]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4653] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4669] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4671] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4675] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 kernel: vlan23: entered promiscuous mode
Jan 26 12:34:57 np0005596062 kernel: vlan21: entered promiscuous mode
Jan 26 12:34:57 np0005596062 systemd-udevd[51777]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4752] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4763] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4781] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4782] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4787] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4803] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4821] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 kernel: vlan20: entered promiscuous mode
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4859] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4862] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4867] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4882] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4897] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4933] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4935] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4940] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4980] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.4992] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.5007] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.5010] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 12:34:57 np0005596062 NetworkManager[48993]: <info>  [1769448897.5015] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 12:34:58 np0005596062 NetworkManager[48993]: <info>  [1769448898.6089] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51771 uid=0 result="success"
Jan 26 12:34:59 np0005596062 NetworkManager[48993]: <info>  [1769448899.8505] checkpoint[0x55fcc7b77950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 26 12:34:59 np0005596062 NetworkManager[48993]: <info>  [1769448899.8517] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51771 uid=0 result="success"
Jan 26 12:34:59 np0005596062 python3.9[52129]: ansible-ansible.legacy.async_status Invoked with jid=j731008907291.51765 mode=status _async_dir=/root/.ansible_async
Jan 26 12:35:00 np0005596062 ansible-async_wrapper.py[51768]: 51769 still running (300)
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.1623] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51771 uid=0 result="success"
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.1632] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51771 uid=0 result="success"
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.3718] audit: op="networking-control" arg="global-dns-configuration" pid=51771 uid=0 result="success"
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.3746] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.3774] audit: op="networking-control" arg="global-dns-configuration" pid=51771 uid=0 result="success"
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.3791] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51771 uid=0 result="success"
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.5159] checkpoint[0x55fcc7b77a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 26 12:35:00 np0005596062 NetworkManager[48993]: <info>  [1769448900.5163] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51771 uid=0 result="success"
Jan 26 12:35:00 np0005596062 ansible-async_wrapper.py[51769]: Module complete (51769)
Jan 26 12:35:00 np0005596062 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 12:35:03 np0005596062 python3.9[52237]: ansible-ansible.legacy.async_status Invoked with jid=j731008907291.51765 mode=status _async_dir=/root/.ansible_async
Jan 26 12:35:03 np0005596062 python3.9[52336]: ansible-ansible.legacy.async_status Invoked with jid=j731008907291.51765 mode=cleanup _async_dir=/root/.ansible_async
Jan 26 12:35:04 np0005596062 python3.9[52488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:35:05 np0005596062 ansible-async_wrapper.py[51768]: Done in kid B.
Jan 26 12:35:05 np0005596062 python3.9[52611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448904.1198008-928-195641608627316/.source.returncode _original_basename=.83um5lu2 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:35:05 np0005596062 python3.9[52764]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:35:06 np0005596062 python3.9[52887]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448905.4511118-976-98191231552229/.source.cfg _original_basename=.jqztljr7 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:35:07 np0005596062 python3.9[53039]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:35:08 np0005596062 systemd[1]: Reloading Network Manager...
Jan 26 12:35:08 np0005596062 NetworkManager[48993]: <info>  [1769448908.2604] audit: op="reload" arg="0" pid=53043 uid=0 result="success"
Jan 26 12:35:08 np0005596062 NetworkManager[48993]: <info>  [1769448908.2614] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 26 12:35:08 np0005596062 systemd[1]: Reloaded Network Manager.
Jan 26 12:35:08 np0005596062 systemd[1]: session-11.scope: Deactivated successfully.
Jan 26 12:35:08 np0005596062 systemd[1]: session-11.scope: Consumed 51.068s CPU time.
Jan 26 12:35:08 np0005596062 systemd-logind[781]: Session 11 logged out. Waiting for processes to exit.
Jan 26 12:35:08 np0005596062 systemd-logind[781]: Removed session 11.
Jan 26 12:35:13 np0005596062 systemd-logind[781]: New session 12 of user zuul.
Jan 26 12:35:13 np0005596062 systemd[1]: Started Session 12 of User zuul.
Jan 26 12:35:14 np0005596062 python3.9[53227]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:35:15 np0005596062 python3.9[53382]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:35:17 np0005596062 python3.9[53575]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:35:17 np0005596062 systemd[1]: session-12.scope: Deactivated successfully.
Jan 26 12:35:17 np0005596062 systemd[1]: session-12.scope: Consumed 2.611s CPU time.
Jan 26 12:35:17 np0005596062 systemd-logind[781]: Session 12 logged out. Waiting for processes to exit.
Jan 26 12:35:17 np0005596062 systemd-logind[781]: Removed session 12.
Jan 26 12:35:18 np0005596062 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 12:35:22 np0005596062 systemd-logind[781]: New session 13 of user zuul.
Jan 26 12:35:22 np0005596062 systemd[1]: Started Session 13 of User zuul.
Jan 26 12:35:23 np0005596062 python3.9[53757]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:35:24 np0005596062 python3.9[53911]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:35:25 np0005596062 python3.9[54068]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:35:26 np0005596062 python3.9[54152]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:35:28 np0005596062 python3.9[54306]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:35:30 np0005596062 python3.9[54501]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:35:31 np0005596062 python3.9[54653]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:35:31 np0005596062 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck197682330-merged.mount: Deactivated successfully.
Jan 26 12:35:31 np0005596062 podman[54654]: 2026-01-26 17:35:31.529507576 +0000 UTC m=+0.069763021 system refresh
Jan 26 12:35:32 np0005596062 python3.9[54816]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:35:32 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:35:33 np0005596062 python3.9[54939]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769448931.8447232-199-182932780321140/.source.json follow=False _original_basename=podman_network_config.j2 checksum=b9930c495b0c91f69b00e4959cd149e979402b2d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:35:33 np0005596062 python3.9[55091]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:35:34 np0005596062 python3.9[55214]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769448933.3485966-244-167043302472652/.source.conf follow=False _original_basename=registries.conf.j2 checksum=d562ec5932fcff7c51e03321842af205a2feb813 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:35:35 np0005596062 python3.9[55366]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:35:35 np0005596062 python3.9[55518]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:35:36 np0005596062 python3.9[55670]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:35:37 np0005596062 python3.9[55822]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:35:38 np0005596062 python3.9[55974]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:35:40 np0005596062 python3.9[56127]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:35:41 np0005596062 python3.9[56281]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:35:42 np0005596062 python3.9[56433]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:35:43 np0005596062 python3.9[56585]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:35:44 np0005596062 python3.9[56738]: ansible-service_facts Invoked
Jan 26 12:35:44 np0005596062 network[56755]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:35:44 np0005596062 network[56756]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:35:44 np0005596062 network[56757]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:35:52 np0005596062 python3.9[57209]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:35:55 np0005596062 python3.9[57362]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 12:35:56 np0005596062 python3.9[57514]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:35:57 np0005596062 python3.9[57639]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448955.9765635-677-161120018478950/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:35:58 np0005596062 python3.9[57793]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:35:58 np0005596062 python3.9[57918]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448957.5298653-722-82439365569287/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:00 np0005596062 python3.9[58072]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:02 np0005596062 python3.9[58226]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:36:03 np0005596062 python3.9[58312]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:36:05 np0005596062 python3.9[58466]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:36:06 np0005596062 python3.9[58550]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:36:06 np0005596062 chronyd[783]: chronyd exiting
Jan 26 12:36:06 np0005596062 systemd[1]: Stopping NTP client/server...
Jan 26 12:36:06 np0005596062 systemd[1]: chronyd.service: Deactivated successfully.
Jan 26 12:36:06 np0005596062 systemd[1]: Stopped NTP client/server.
Jan 26 12:36:06 np0005596062 systemd[1]: Starting NTP client/server...
Jan 26 12:36:06 np0005596062 chronyd[58558]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 12:36:06 np0005596062 chronyd[58558]: Frequency -26.805 +/- 0.402 ppm read from /var/lib/chrony/drift
Jan 26 12:36:06 np0005596062 chronyd[58558]: Loaded seccomp filter (level 2)
Jan 26 12:36:06 np0005596062 systemd[1]: Started NTP client/server.
Jan 26 12:36:07 np0005596062 systemd[1]: session-13.scope: Deactivated successfully.
Jan 26 12:36:07 np0005596062 systemd[1]: session-13.scope: Consumed 28.840s CPU time.
Jan 26 12:36:07 np0005596062 systemd-logind[781]: Session 13 logged out. Waiting for processes to exit.
Jan 26 12:36:07 np0005596062 systemd-logind[781]: Removed session 13.
Jan 26 12:36:12 np0005596062 systemd-logind[781]: New session 14 of user zuul.
Jan 26 12:36:12 np0005596062 systemd[1]: Started Session 14 of User zuul.
Jan 26 12:36:13 np0005596062 python3.9[58739]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:14 np0005596062 python3.9[58891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:14 np0005596062 python3.9[59014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448973.3768609-64-187424567797251/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:15 np0005596062 systemd[1]: session-14.scope: Deactivated successfully.
Jan 26 12:36:15 np0005596062 systemd[1]: session-14.scope: Consumed 1.990s CPU time.
Jan 26 12:36:15 np0005596062 systemd-logind[781]: Session 14 logged out. Waiting for processes to exit.
Jan 26 12:36:15 np0005596062 systemd-logind[781]: Removed session 14.
Jan 26 12:36:20 np0005596062 systemd-logind[781]: New session 15 of user zuul.
Jan 26 12:36:20 np0005596062 systemd[1]: Started Session 15 of User zuul.
Jan 26 12:36:21 np0005596062 python3.9[59192]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:36:22 np0005596062 python3.9[59348]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:23 np0005596062 python3.9[59523]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:24 np0005596062 python3.9[59646]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769448982.636953-85-8859763907874/.source.json _original_basename=.sx6bfstc follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:25 np0005596062 python3.9[59798]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:25 np0005596062 python3.9[59921]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769448984.628796-154-24722624959927/.source _original_basename=.rsawho5r follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:26 np0005596062 python3.9[60073]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:36:27 np0005596062 python3.9[60225]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:27 np0005596062 python3.9[60348]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769448986.7506182-227-146426766668634/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:36:28 np0005596062 python3.9[60500]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:29 np0005596062 python3.9[60623]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769448988.0314543-227-185773344356762/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:36:30 np0005596062 python3.9[60775]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:31 np0005596062 python3.9[60927]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:31 np0005596062 python3.9[61050]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769448990.6041934-338-259179064152913/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:32 np0005596062 python3.9[61202]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:33 np0005596062 python3.9[61325]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769448991.9890585-383-174945986760270/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:34 np0005596062 python3.9[61477]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:36:34 np0005596062 systemd[1]: Reloading.
Jan 26 12:36:34 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:36:34 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:36:34 np0005596062 systemd[1]: Reloading.
Jan 26 12:36:34 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:36:34 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:36:34 np0005596062 systemd[1]: Starting EDPM Container Shutdown...
Jan 26 12:36:34 np0005596062 systemd[1]: Finished EDPM Container Shutdown.
Jan 26 12:36:35 np0005596062 python3.9[61705]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:36 np0005596062 python3.9[61828]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769448995.1105485-451-181009734507471/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:37 np0005596062 python3.9[61980]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:37 np0005596062 python3.9[62103]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769448996.486309-497-123989314088118/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:38 np0005596062 python3.9[62255]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:36:38 np0005596062 systemd[1]: Reloading.
Jan 26 12:36:38 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:36:38 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:36:38 np0005596062 systemd[1]: Reloading.
Jan 26 12:36:38 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:36:38 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:36:39 np0005596062 systemd[1]: Starting Create netns directory...
Jan 26 12:36:39 np0005596062 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 12:36:39 np0005596062 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 12:36:39 np0005596062 systemd[1]: Finished Create netns directory.
Jan 26 12:36:40 np0005596062 python3.9[62481]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:36:40 np0005596062 network[62498]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:36:40 np0005596062 network[62499]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:36:40 np0005596062 network[62500]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:36:44 np0005596062 python3.9[62762]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:36:44 np0005596062 systemd[1]: Reloading.
Jan 26 12:36:44 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:36:44 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:36:44 np0005596062 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 26 12:36:44 np0005596062 iptables.init[62803]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 26 12:36:44 np0005596062 iptables.init[62803]: iptables: Flushing firewall rules: [  OK  ]
Jan 26 12:36:44 np0005596062 systemd[1]: iptables.service: Deactivated successfully.
Jan 26 12:36:44 np0005596062 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 26 12:36:45 np0005596062 python3.9[62999]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:36:46 np0005596062 python3.9[63153]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:36:47 np0005596062 systemd[1]: Reloading.
Jan 26 12:36:47 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:36:47 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:36:47 np0005596062 systemd[1]: Starting Netfilter Tables...
Jan 26 12:36:47 np0005596062 systemd[1]: Finished Netfilter Tables.
Jan 26 12:36:48 np0005596062 python3.9[63345]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:36:49 np0005596062 python3.9[63498]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:50 np0005596062 python3.9[63623]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449009.3414497-705-278841095300294/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:51 np0005596062 python3.9[63776]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:36:51 np0005596062 systemd[1]: Reloading OpenSSH server daemon...
Jan 26 12:36:51 np0005596062 systemd[1]: Reloaded OpenSSH server daemon.
Jan 26 12:36:52 np0005596062 python3.9[63932]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:53 np0005596062 python3.9[64084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:53 np0005596062 python3.9[64207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449012.6227577-797-8057288497794/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:55 np0005596062 python3.9[64359]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 12:36:55 np0005596062 systemd[1]: Starting Time & Date Service...
Jan 26 12:36:55 np0005596062 systemd[1]: Started Time & Date Service.
Jan 26 12:36:55 np0005596062 python3.9[64515]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:56 np0005596062 python3.9[64667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:57 np0005596062 python3.9[64790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449016.1153255-902-238748659577286/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:58 np0005596062 python3.9[64942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:36:58 np0005596062 python3.9[65065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449017.5301287-947-225267528201580/.source.yaml _original_basename=.3hvv95a2 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:36:59 np0005596062 python3.9[65217]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:37:00 np0005596062 python3.9[65340]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449018.937647-992-168317250415966/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:01 np0005596062 python3.9[65492]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:01 np0005596062 python3.9[65645]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:02 np0005596062 python3[65798]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 12:37:03 np0005596062 python3.9[65950]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:37:04 np0005596062 python3.9[66073]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449022.9913096-1109-113665995624382/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:04 np0005596062 python3.9[66225]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:37:05 np0005596062 python3.9[66348]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449024.4369087-1154-48306078428997/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:06 np0005596062 python3.9[66500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:37:06 np0005596062 python3.9[66623]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449025.7742066-1199-1590618029630/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:07 np0005596062 python3.9[66775]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:37:08 np0005596062 python3.9[66898]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449027.1571033-1244-198909187757303/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:09 np0005596062 python3.9[67050]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:37:09 np0005596062 python3.9[67173]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449028.643308-1289-48084118763657/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:10 np0005596062 python3.9[67325]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:11 np0005596062 python3.9[67477]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:12 np0005596062 python3.9[67636]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:13 np0005596062 python3.9[67789]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:13 np0005596062 python3.9[67941]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:14 np0005596062 python3.9[68094]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 12:37:15 np0005596062 python3.9[68247]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 12:37:16 np0005596062 systemd[1]: session-15.scope: Deactivated successfully.
Jan 26 12:37:16 np0005596062 systemd[1]: session-15.scope: Consumed 41.658s CPU time.
Jan 26 12:37:16 np0005596062 systemd-logind[781]: Session 15 logged out. Waiting for processes to exit.
Jan 26 12:37:16 np0005596062 systemd-logind[781]: Removed session 15.
Jan 26 12:37:21 np0005596062 systemd-logind[781]: New session 16 of user zuul.
Jan 26 12:37:21 np0005596062 systemd[1]: Started Session 16 of User zuul.
Jan 26 12:37:22 np0005596062 python3.9[68428]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 12:37:23 np0005596062 python3.9[68580]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:37:24 np0005596062 python3.9[68732]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:37:25 np0005596062 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 12:37:25 np0005596062 python3.9[68886]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0zaLI2LTbNOyYJLCkCHwBNvCbWxyjbFipOdeKx9WVOSI6BraalDHlRpumUYDm8JC8abEq1qaZCBLmxjPXdZu5OGr/kPmf6SKEUmhy4iVIlqya8lpE59ci/zJO3FmNG+BncaGfJAQ0wqUgfNc/27u/wxD+gMrd6Ocz1dRHjtV22N4KnHAZP+sb0G1LZUx4WhJ07B4r/YaWeXOL2puHk0zHfnxSMIyyEvTlx9zlqSArxDuyq6AA7skTmkIlIC7eYbws7R3oP5PdtDl0sj1SEaTS4uAOSxbcYCV3H/IBa5evA+pxo7m3gf2YQ/QsGcfMQF4GefF3pWfZN0BGK7DWb3bckv62Oq9geYx47ccajXIEt3vsncvsrZhozX5OPyxW4eLJ8r7ovCX+5uGTuF9LrmwDdc7XRJ7rXBWSKh66/yxUcPGEQIk7OoEA30ZmKeipyMJQHHrWKxAqkqz6+ZQ41KvXaFIB1lRQf4tlFTAfrm9xwChyoCfrU95QYM4V+zqCQ6E=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILbMiL3+EkWDKAQHi9JT5Xqvk8rNrdT5SVX2Gg2RyqsV#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLVPallz3Z+vrxzfd9Dxuo/G10ZpIDOna2ftaoWWaEiUQrn77C3vB8d1zHHnHxMi8qaS4W4lfA32FenhGfBnVVU=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC8WJSSyps5/MOwluaYVKvHLbB3OOMaGha+S5zKQqPSAcedSyuyvzK3GC+qad2ZbcfCfiNZHWM+ylBueRDL14BxpBXCAqNKHN1Yo1Fvlb4JCkcbhbgkVGemDEsbBiNmTtSlxRI40uI8M0+E42b22Zh7qz1PC1XmS0po5y6SwzcfgbnZtuyVFsvGHqDWkkWV/gsjiZ57qMaC+DJaIhvfW+qObinKJqXeuPQbF6yjfhXPHf2nwYEGY9rM5zEvZyfC/Dnrg62lDFjq4LGLrb83ipcBQq+zMejeECDs/u6noWAMs8f5HcxW0zembv86K5pOtPJKA13xVImv+kfGS+EctaKEBB/ooqOhN9AdXFEJUuSDn/2iUm07NnrEN9WhrfiuxLCO/lBWwxFGKcQECRviuCwE51F4fVEduv4ZiDgPcsHo+fYbxXsG50xc8/Yumd+a60pkpu09wVk1P3fCbFbRd9kD4elm067blILF+Zs+YuWnuaK3LiCb+qzmDKQB4AArubE=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILePT0ow4c3ejDoUzP/5T/dIHfr1xTtwEP/2z/Lf68vz#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB0tLEQbxQsuF0gTFyU7HBbMRjNrt7rMl1+QXcK3yfs0Q29raINYHrTVwzWeSuTUiO464HBZr4aPyLzhd+2Z3xs=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCOz38rnMu5RPID5R9a4AOkL2Ge6a4dzWxjmOZKIbuidITYge9lyZ+ThI161k8ZELWw9SBoQvNwVmySyCRLJH9qPhNCVmEqUqZJohUEZQ+lNpyZk3JkhZsgLTYjkdV/DPqp3iLlV/asPhl18j+CFKmN5Dx0qMsAg1f9CbOZwhdgeVEeB3IqdjBrPIMgAwVlacU9ty90SAUJj+RoMZePfAh7i2q7VTPHcvKRA1Mz4Q+RRKojI3DfR0se9vFL9KYNhD/O0JbAZksdom7tVuZ6LjcyIYqBUeB2jYwSO66sVFNWI4JwFEr5OOb1EiOGWGudWuZVfdeD+TYeZk0hco2GhtmXBVDWWeYQNNXAKRcQ7aM2y9SlN6gOKzJq08LuoShMOl8IuErTDV7Cp3WpuPPqDc5gv0swDVoOXsbju1Bxm2aLE7d1GiJbuhLS+pvIgc0MrnyOhUrTGTAdyfZ4gsw6BekK5Gf22C6xvZ865/N5LCr5jahKtqujZ6X6sECNsBQ1j0M=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOdmNmdvqfqzPDx4l6nvkEw8mwn78xc6LydRgAb6QEGT#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKb0RFR0G0BOVptSrXD3m/y/AD2q+whTWANps4FtvEcdq4zrHxHJM7JO/mkAyT4VEcyt7wmguNEWF5NqwEZeFZ4=#012 create=True mode=0644 path=/tmp/ansible.jkdrhs5m state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:26 np0005596062 python3.9[69038]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.jkdrhs5m' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:27 np0005596062 python3.9[69192]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.jkdrhs5m state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:27 np0005596062 systemd[1]: session-16.scope: Deactivated successfully.
Jan 26 12:37:27 np0005596062 systemd[1]: session-16.scope: Consumed 4.265s CPU time.
Jan 26 12:37:27 np0005596062 systemd-logind[781]: Session 16 logged out. Waiting for processes to exit.
Jan 26 12:37:27 np0005596062 systemd-logind[781]: Removed session 16.
Jan 26 12:37:33 np0005596062 systemd-logind[781]: New session 17 of user zuul.
Jan 26 12:37:33 np0005596062 systemd[1]: Started Session 17 of User zuul.
Jan 26 12:37:34 np0005596062 python3.9[69370]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:37:36 np0005596062 python3.9[69526]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 12:37:36 np0005596062 python3.9[69680]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:37:38 np0005596062 python3.9[69833]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:38 np0005596062 python3.9[69986]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:37:39 np0005596062 python3.9[70141]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:40 np0005596062 python3.9[70296]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:37:40 np0005596062 systemd[1]: session-17.scope: Deactivated successfully.
Jan 26 12:37:40 np0005596062 systemd[1]: session-17.scope: Consumed 4.915s CPU time.
Jan 26 12:37:40 np0005596062 systemd-logind[781]: Session 17 logged out. Waiting for processes to exit.
Jan 26 12:37:41 np0005596062 systemd-logind[781]: Removed session 17.
Jan 26 12:37:45 np0005596062 systemd-logind[781]: New session 18 of user zuul.
Jan 26 12:37:45 np0005596062 systemd[1]: Started Session 18 of User zuul.
Jan 26 12:37:47 np0005596062 python3.9[70474]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:37:48 np0005596062 python3.9[70630]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:37:49 np0005596062 python3.9[70714]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 12:37:51 np0005596062 python3.9[70865]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:37:52 np0005596062 python3.9[71016]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 12:37:53 np0005596062 python3.9[71166]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:37:53 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 12:37:53 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 12:37:54 np0005596062 python3.9[71317]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:37:54 np0005596062 systemd[1]: session-18.scope: Deactivated successfully.
Jan 26 12:37:54 np0005596062 systemd[1]: session-18.scope: Consumed 6.466s CPU time.
Jan 26 12:37:54 np0005596062 systemd-logind[781]: Session 18 logged out. Waiting for processes to exit.
Jan 26 12:37:54 np0005596062 systemd-logind[781]: Removed session 18.
Jan 26 12:38:03 np0005596062 systemd-logind[781]: New session 19 of user zuul.
Jan 26 12:38:03 np0005596062 systemd[1]: Started Session 19 of User zuul.
Jan 26 12:38:09 np0005596062 python3[72083]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:38:11 np0005596062 python3[72178]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 12:38:13 np0005596062 python3[72205]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 12:38:13 np0005596062 python3[72231]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:38:13 np0005596062 kernel: loop: module loaded
Jan 26 12:38:13 np0005596062 kernel: loop3: detected capacity change from 0 to 14680064
Jan 26 12:38:14 np0005596062 python3[72266]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:38:14 np0005596062 lvm[72269]: PV /dev/loop3 not used.
Jan 26 12:38:14 np0005596062 lvm[72278]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 12:38:14 np0005596062 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 26 12:38:14 np0005596062 lvm[72280]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 26 12:38:14 np0005596062 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 26 12:38:14 np0005596062 python3[72359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 12:38:15 np0005596062 python3[72432]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769449094.6815884-36957-184983198886587/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:38:16 np0005596062 python3[72482]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:38:16 np0005596062 systemd[1]: Reloading.
Jan 26 12:38:16 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:38:16 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:38:16 np0005596062 systemd[1]: Starting Ceph OSD losetup...
Jan 26 12:38:16 np0005596062 bash[72522]: /dev/loop3: [64513]:4328903 (/var/lib/ceph-osd-0.img)
Jan 26 12:38:16 np0005596062 systemd[1]: Finished Ceph OSD losetup.
Jan 26 12:38:16 np0005596062 lvm[72523]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 12:38:16 np0005596062 lvm[72523]: VG ceph_vg0 finished
Jan 26 12:38:16 np0005596062 chronyd[58558]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 26 12:38:19 np0005596062 python3[72547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:40:25 np0005596062 systemd[1]: Created slice User Slice of UID 42477.
Jan 26 12:40:25 np0005596062 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 26 12:40:25 np0005596062 systemd-logind[781]: New session 20 of user ceph-admin.
Jan 26 12:40:25 np0005596062 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 26 12:40:25 np0005596062 systemd[1]: Starting User Manager for UID 42477...
Jan 26 12:40:25 np0005596062 systemd[72598]: Queued start job for default target Main User Target.
Jan 26 12:40:25 np0005596062 systemd-logind[781]: New session 22 of user ceph-admin.
Jan 26 12:40:25 np0005596062 systemd[72598]: Created slice User Application Slice.
Jan 26 12:40:25 np0005596062 systemd[72598]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 12:40:25 np0005596062 systemd[72598]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 12:40:25 np0005596062 systemd[72598]: Reached target Paths.
Jan 26 12:40:25 np0005596062 systemd[72598]: Reached target Timers.
Jan 26 12:40:25 np0005596062 systemd[72598]: Starting D-Bus User Message Bus Socket...
Jan 26 12:40:25 np0005596062 systemd[72598]: Starting Create User's Volatile Files and Directories...
Jan 26 12:40:25 np0005596062 systemd[72598]: Listening on D-Bus User Message Bus Socket.
Jan 26 12:40:25 np0005596062 systemd[72598]: Reached target Sockets.
Jan 26 12:40:25 np0005596062 systemd[72598]: Finished Create User's Volatile Files and Directories.
Jan 26 12:40:25 np0005596062 systemd[72598]: Reached target Basic System.
Jan 26 12:40:25 np0005596062 systemd[72598]: Reached target Main User Target.
Jan 26 12:40:25 np0005596062 systemd[72598]: Startup finished in 132ms.
Jan 26 12:40:25 np0005596062 systemd[1]: Started User Manager for UID 42477.
Jan 26 12:40:25 np0005596062 systemd[1]: Started Session 20 of User ceph-admin.
Jan 26 12:40:25 np0005596062 systemd[1]: Started Session 22 of User ceph-admin.
Jan 26 12:40:25 np0005596062 systemd-logind[781]: New session 23 of user ceph-admin.
Jan 26 12:40:26 np0005596062 systemd[1]: Started Session 23 of User ceph-admin.
Jan 26 12:40:26 np0005596062 systemd-logind[781]: New session 24 of user ceph-admin.
Jan 26 12:40:26 np0005596062 systemd[1]: Started Session 24 of User ceph-admin.
Jan 26 12:40:26 np0005596062 systemd-logind[781]: New session 25 of user ceph-admin.
Jan 26 12:40:26 np0005596062 systemd[1]: Started Session 25 of User ceph-admin.
Jan 26 12:40:27 np0005596062 systemd-logind[781]: New session 26 of user ceph-admin.
Jan 26 12:40:27 np0005596062 systemd[1]: Started Session 26 of User ceph-admin.
Jan 26 12:40:27 np0005596062 systemd-logind[781]: New session 27 of user ceph-admin.
Jan 26 12:40:27 np0005596062 systemd[1]: Started Session 27 of User ceph-admin.
Jan 26 12:40:28 np0005596062 systemd-logind[781]: New session 28 of user ceph-admin.
Jan 26 12:40:28 np0005596062 systemd[1]: Started Session 28 of User ceph-admin.
Jan 26 12:40:28 np0005596062 systemd-logind[781]: New session 29 of user ceph-admin.
Jan 26 12:40:28 np0005596062 systemd[1]: Started Session 29 of User ceph-admin.
Jan 26 12:40:29 np0005596062 systemd-logind[781]: New session 30 of user ceph-admin.
Jan 26 12:40:29 np0005596062 systemd[1]: Started Session 30 of User ceph-admin.
Jan 26 12:40:29 np0005596062 systemd-logind[781]: New session 31 of user ceph-admin.
Jan 26 12:40:29 np0005596062 systemd[1]: Started Session 31 of User ceph-admin.
Jan 26 12:40:30 np0005596062 systemd-logind[781]: New session 32 of user ceph-admin.
Jan 26 12:40:30 np0005596062 systemd[1]: Started Session 32 of User ceph-admin.
Jan 26 12:40:30 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:14 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:14 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:15 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:15 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:15 np0005596062 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73622 (sysctl)
Jan 26 12:41:15 np0005596062 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 26 12:41:15 np0005596062 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 26 12:41:16 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:17 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:19 np0005596062 systemd[1]: var-lib-containers-storage-overlay-compat3734421192-lower\x2dmapped.mount: Deactivated successfully.
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.120514249 +0000 UTC m=+16.442984985 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.150453729 +0000 UTC m=+16.472924445 container create 354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_brattain, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 26 12:41:34 np0005596062 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2980918653-merged.mount: Deactivated successfully.
Jan 26 12:41:34 np0005596062 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 26 12:41:34 np0005596062 systemd[1]: Started libpod-conmon-354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9.scope.
Jan 26 12:41:34 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.295561121 +0000 UTC m=+16.618031867 container init 354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_brattain, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.302458065 +0000 UTC m=+16.624928781 container start 354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_brattain, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:41:34 np0005596062 keen_brattain[73957]: 167 167
Jan 26 12:41:34 np0005596062 systemd[1]: libpod-354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9.scope: Deactivated successfully.
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.320837496 +0000 UTC m=+16.643308232 container attach 354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_brattain, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.321548665 +0000 UTC m=+16.644019381 container died 354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_brattain, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:41:34 np0005596062 systemd[1]: var-lib-containers-storage-overlay-8a739b2e675bfe341b98bc9205ef80a2fca7a6b47d92de128c1f3f1be58d6846-merged.mount: Deactivated successfully.
Jan 26 12:41:34 np0005596062 podman[73898]: 2026-01-26 17:41:34.397884077 +0000 UTC m=+16.720354813 container remove 354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_brattain, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:41:34 np0005596062 systemd[1]: libpod-conmon-354bb378a83c2cf3b2ec92d1670b1b915154da4802817db67a4f02856a4a24b9.scope: Deactivated successfully.
Jan 26 12:41:34 np0005596062 podman[73981]: 2026-01-26 17:41:34.583640635 +0000 UTC m=+0.059090172 container create ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_colden, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Jan 26 12:41:34 np0005596062 systemd[1]: Started libpod-conmon-ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3.scope.
Jan 26 12:41:34 np0005596062 podman[73981]: 2026-01-26 17:41:34.551650149 +0000 UTC m=+0.027099696 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:34 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:41:34 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d29ffc04a9989c8962d19eb27d5baa10a51239d0524293d98c37088826cc3a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:34 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d29ffc04a9989c8962d19eb27d5baa10a51239d0524293d98c37088826cc3a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:34 np0005596062 podman[73981]: 2026-01-26 17:41:34.668544306 +0000 UTC m=+0.143993833 container init ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_colden, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 26 12:41:34 np0005596062 podman[73981]: 2026-01-26 17:41:34.675390079 +0000 UTC m=+0.150839606 container start ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_colden, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:41:34 np0005596062 podman[73981]: 2026-01-26 17:41:34.679717964 +0000 UTC m=+0.155167491 container attach ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_colden, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Jan 26 12:41:35 np0005596062 condescending_colden[73997]: [
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:    {
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "available": false,
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "ceph_device": false,
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "lsm_data": {},
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "lvs": [],
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "path": "/dev/sr0",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "rejected_reasons": [
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "Insufficient space (<5GB)",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "Has a FileSystem"
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        ],
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        "sys_api": {
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "actuators": null,
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "device_nodes": "sr0",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "devname": "sr0",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "human_readable_size": "482.00 KB",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "id_bus": "ata",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "model": "QEMU DVD-ROM",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "nr_requests": "2",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "parent": "/dev/sr0",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "partitions": {},
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "path": "/dev/sr0",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "removable": "1",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "rev": "2.5+",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "ro": "0",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "rotational": "1",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "sas_address": "",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "sas_device_handle": "",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "scheduler_mode": "mq-deadline",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "sectors": 0,
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "sectorsize": "2048",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "size": 493568.0,
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "support_discard": "2048",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "type": "disk",
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:            "vendor": "QEMU"
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:        }
Jan 26 12:41:35 np0005596062 condescending_colden[73997]:    }
Jan 26 12:41:35 np0005596062 condescending_colden[73997]: ]
Jan 26 12:41:35 np0005596062 systemd[1]: libpod-ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3.scope: Deactivated successfully.
Jan 26 12:41:35 np0005596062 systemd[1]: libpod-ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3.scope: Consumed 1.138s CPU time.
Jan 26 12:41:35 np0005596062 podman[73981]: 2026-01-26 17:41:35.804270439 +0000 UTC m=+1.279719966 container died ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_colden, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True)
Jan 26 12:41:35 np0005596062 systemd[1]: var-lib-containers-storage-overlay-b0d29ffc04a9989c8962d19eb27d5baa10a51239d0524293d98c37088826cc3a-merged.mount: Deactivated successfully.
Jan 26 12:41:35 np0005596062 podman[73981]: 2026-01-26 17:41:35.855112648 +0000 UTC m=+1.330562175 container remove ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_colden, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 26 12:41:35 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:35 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:35 np0005596062 systemd[1]: libpod-conmon-ecae9ad0416a7d8ed83c163367971fa647c74695a71d4d050aeb35b48764d3b3.scope: Deactivated successfully.
Jan 26 12:41:41 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:41 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.385115804 +0000 UTC m=+0.045781376 container create 9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_black, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:41:41 np0005596062 systemd[1]: Started libpod-conmon-9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6.scope.
Jan 26 12:41:41 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.458015963 +0000 UTC m=+0.118681495 container init 9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_black, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.369036384 +0000 UTC m=+0.029701946 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.465236377 +0000 UTC m=+0.125901919 container start 9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_black, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.468643558 +0000 UTC m=+0.129309080 container attach 9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_black, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:41:41 np0005596062 beautiful_black[76840]: 167 167
Jan 26 12:41:41 np0005596062 systemd[1]: libpod-9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6.scope: Deactivated successfully.
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.470886538 +0000 UTC m=+0.131552070 container died 9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_black, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 26 12:41:41 np0005596062 podman[76823]: 2026-01-26 17:41:41.517074253 +0000 UTC m=+0.177739785 container remove 9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=beautiful_black, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:41:41 np0005596062 systemd[1]: libpod-conmon-9f62b07a7fa5869fc0ad102ec11bceaf01081fb08f0caec62186f2c4ea8da8e6.scope: Deactivated successfully.
Jan 26 12:41:41 np0005596062 podman[76862]: 2026-01-26 17:41:41.588563405 +0000 UTC m=+0.042431326 container create bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_brahmagupta, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 26 12:41:41 np0005596062 systemd[1]: Started libpod-conmon-bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe.scope.
Jan 26 12:41:41 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:41:41 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b44142f44ecc6c408eb06fdce97e91a052fc9f6d5f7562f0966f24f14f0a407b/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:41 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b44142f44ecc6c408eb06fdce97e91a052fc9f6d5f7562f0966f24f14f0a407b/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:41 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b44142f44ecc6c408eb06fdce97e91a052fc9f6d5f7562f0966f24f14f0a407b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:41 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b44142f44ecc6c408eb06fdce97e91a052fc9f6d5f7562f0966f24f14f0a407b/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:41 np0005596062 podman[76862]: 2026-01-26 17:41:41.653320917 +0000 UTC m=+0.107188898 container init bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_brahmagupta, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 26 12:41:41 np0005596062 podman[76862]: 2026-01-26 17:41:41.658743152 +0000 UTC m=+0.112611053 container start bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_brahmagupta, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:41:41 np0005596062 podman[76862]: 2026-01-26 17:41:41.661720971 +0000 UTC m=+0.115588902 container attach bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_brahmagupta, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Jan 26 12:41:41 np0005596062 podman[76862]: 2026-01-26 17:41:41.566870295 +0000 UTC m=+0.020738226 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:41 np0005596062 systemd[1]: libpod-bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe.scope: Deactivated successfully.
Jan 26 12:41:41 np0005596062 podman[76904]: 2026-01-26 17:41:41.753983729 +0000 UTC m=+0.022718079 container died bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_brahmagupta, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 12:41:41 np0005596062 podman[76904]: 2026-01-26 17:41:41.78881035 +0000 UTC m=+0.057544690 container remove bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=practical_brahmagupta, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef)
Jan 26 12:41:41 np0005596062 systemd[1]: libpod-conmon-bc73d6ab6d1b23f46a378c39af24a532eb739b49ba66b3d4a1ddfa46016decfe.scope: Deactivated successfully.
Jan 26 12:41:41 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:41 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:41 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:42 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:42 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:42 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:42 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:42 np0005596062 systemd[1]: Reached target All Ceph clusters and services.
Jan 26 12:41:42 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:42 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:42 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:42 np0005596062 systemd[1]: Reached target Ceph cluster d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:41:42 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:42 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:42 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:42 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:42 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:43 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:43 np0005596062 systemd[1]: Created slice Slice /system/ceph-d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:41:43 np0005596062 systemd[1]: Reached target System Time Set.
Jan 26 12:41:43 np0005596062 systemd[1]: Reached target System Time Synchronized.
Jan 26 12:41:43 np0005596062 systemd[1]: Starting Ceph mon.compute-2 for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:41:43 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:43 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:43 np0005596062 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 12:41:43 np0005596062 podman[77158]: 2026-01-26 17:41:43.493035628 +0000 UTC m=+0.048229501 container create 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 26 12:41:43 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b013da9f33016842117ebac50cc45919440c9414e23a19a1ea92252af5b61ccf/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:43 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b013da9f33016842117ebac50cc45919440c9414e23a19a1ea92252af5b61ccf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:43 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b013da9f33016842117ebac50cc45919440c9414e23a19a1ea92252af5b61ccf/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:43 np0005596062 podman[77158]: 2026-01-26 17:41:43.565233639 +0000 UTC m=+0.120427542 container init 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 26 12:41:43 np0005596062 podman[77158]: 2026-01-26 17:41:43.473627779 +0000 UTC m=+0.028821682 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:43 np0005596062 podman[77158]: 2026-01-26 17:41:43.572175754 +0000 UTC m=+0.127369627 container start 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:41:43 np0005596062 bash[77158]: 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df
Jan 26 12:41:43 np0005596062 systemd[1]: Started Ceph mon.compute-2 for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: pidfile_write: ignore empty --pid-file
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: load: jerasure load: lrc 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: RocksDB version: 7.9.2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Git sha 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: DB SUMMARY
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: DB Session ID:  WVAUTHFR912YXSABJRD6
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: CURRENT file:  CURRENT
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                         Options.error_if_exists: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.create_if_missing: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                                     Options.env: 0x55d9ca355c40
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                                Options.info_log: 0x55d9cbc96fc0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                              Options.statistics: (nil)
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                               Options.use_fsync: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                              Options.db_log_dir: 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                                 Options.wal_dir: 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                    Options.write_buffer_manager: 0x55d9cbca6b40
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.unordered_write: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                               Options.row_cache: None
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                              Options.wal_filter: None
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.two_write_queues: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.wal_compression: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.atomic_flush: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.max_background_jobs: 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.max_background_compactions: -1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.max_subcompactions: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.max_total_wal_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                          Options.max_open_files: -1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:       Options.compaction_readahead_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Compression algorithms supported:
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kZSTD supported: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kXpressCompression supported: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kBZip2Compression supported: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kLZ4Compression supported: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kZlibCompression supported: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: 	kSnappyCompression supported: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:           Options.merge_operator: 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:        Options.compaction_filter: None
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d9cbc96c00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d9cbc8f1f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:        Options.write_buffer_size: 33554432
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:  Options.max_write_buffer_number: 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.compression: NoCompression
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.num_levels: 7
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 672fd1c3-93d2-431e-9d5a-4531180f45cc
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449303622587, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449303624282, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449303624380, "job": 1, "event": "recovery_finished"}
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d9cbcb8e00
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: DB pointer 0x55d9cbd42000
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid d4cd1917-5876-51b6-bc64-65a16199754d
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(???) e0 preinit fsid d4cd1917-5876-51b6-bc64-65a16199754d
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e18 crush map has features 3314933000852226048, adjusting msgr requires
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e18 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3706561662' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: Updating compute-2:/var/lib/ceph/d4cd1917-5876-51b6-bc64-65a16199754d/config/ceph.client.admin.keyring
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1447052594' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1447052594' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: Deploying daemon mon.compute-2 on compute-2
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1106791732' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1106791732' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e18 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).osd e18 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 12:41:43 np0005596062 ceph-mon[77178]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 26 12:41:45 np0005596062 ceph-mon[77178]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 26 12:41:45 np0005596062 ceph-mon[77178]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 26 12:41:45 np0005596062 ceph-mon[77178]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 26 12:41:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 26 12:41:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-26T17:41:41.693001Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864308,os=Linux}
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e19 e19: 2 total, 2 up, 2 in
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: Deploying daemon mon.compute-1 on compute-1
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3575996095' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-0 calling monitor election
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-2 calling monitor election
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Jan 26 12:41:48 np0005596062 ceph-mon[77178]:    application not enabled on pool 'vms'
Jan 26 12:41:48 np0005596062 ceph-mon[77178]:    application not enabled on pool 'volumes'
Jan 26 12:41:48 np0005596062 ceph-mon[77178]:    application not enabled on pool 'backups'
Jan 26 12:41:48 np0005596062 ceph-mon[77178]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cchxrf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.618415014 +0000 UTC m=+0.065386050 container create b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_pike, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:41:49 np0005596062 systemd[1]: Started libpod-conmon-b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b.scope.
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.590832887 +0000 UTC m=+0.037803963 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:49 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.717253268 +0000 UTC m=+0.164224284 container init b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_pike, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.731069967 +0000 UTC m=+0.178041003 container start b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_pike, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.7360602 +0000 UTC m=+0.183031196 container attach b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_pike, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 26 12:41:49 np0005596062 quirky_pike[77376]: 167 167
Jan 26 12:41:49 np0005596062 systemd[1]: libpod-b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b.scope: Deactivated successfully.
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.740247872 +0000 UTC m=+0.187218868 container died b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_pike, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:41:49 np0005596062 systemd[1]: var-lib-containers-storage-overlay-5992d38ffb5ed19b114c6a53219b7ec43b067d00b85c95941a5d641dacac7c18-merged.mount: Deactivated successfully.
Jan 26 12:41:49 np0005596062 podman[77359]: 2026-01-26 17:41:49.776075991 +0000 UTC m=+0.223046997 container remove b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_pike, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:41:49 np0005596062 systemd[1]: libpod-conmon-b5d71b4a35bf4a6e6caf22458844d9e36c8f8380a2f33948bfa134b121a2a27b.scope: Deactivated successfully.
Jan 26 12:41:49 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cchxrf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: Deploying daemon mgr.compute-2.cchxrf on compute-2
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3575996095' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e20 e20: 2 total, 2 up, 2 in
Jan 26 12:41:49 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:49 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 26 12:41:50 np0005596062 ceph-mon[77178]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 26 12:41:50 np0005596062 ceph-mon[77178]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 26 12:41:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:50 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:50 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:50 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:50 np0005596062 systemd[1]: Starting Ceph mgr.compute-2.cchxrf for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:41:50 np0005596062 podman[77518]: 2026-01-26 17:41:50.602459802 +0000 UTC m=+0.042678583 container create 73cb5682238b8ba779b74bc2af9ed6a8bab92cd800f48edb5cd149c8d7f8469f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 26 12:41:50 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66421684897ddf749db842dedb0c1df3b26b92ac9d765ec395bef5d27d004a4c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:50 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66421684897ddf749db842dedb0c1df3b26b92ac9d765ec395bef5d27d004a4c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:50 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66421684897ddf749db842dedb0c1df3b26b92ac9d765ec395bef5d27d004a4c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:50 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66421684897ddf749db842dedb0c1df3b26b92ac9d765ec395bef5d27d004a4c/merged/var/lib/ceph/mgr/ceph-compute-2.cchxrf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:41:50 np0005596062 podman[77518]: 2026-01-26 17:41:50.672561047 +0000 UTC m=+0.112779848 container init 73cb5682238b8ba779b74bc2af9ed6a8bab92cd800f48edb5cd149c8d7f8469f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:41:50 np0005596062 podman[77518]: 2026-01-26 17:41:50.581344367 +0000 UTC m=+0.021563158 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:50 np0005596062 podman[77518]: 2026-01-26 17:41:50.679239875 +0000 UTC m=+0.119458646 container start 73cb5682238b8ba779b74bc2af9ed6a8bab92cd800f48edb5cd149c8d7f8469f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Jan 26 12:41:50 np0005596062 bash[77518]: 73cb5682238b8ba779b74bc2af9ed6a8bab92cd800f48edb5cd149c8d7f8469f
Jan 26 12:41:50 np0005596062 systemd[1]: Started Ceph mgr.compute-2.cchxrf for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:41:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: pidfile_write: ignore empty --pid-file
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e21 e21: 2 total, 2 up, 2 in
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3490187801' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-0 calling monitor election
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-2 calling monitor election
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-1 calling monitor election
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Jan 26 12:41:55 np0005596062 ceph-mon[77178]:    application not enabled on pool 'vms'
Jan 26 12:41:55 np0005596062 ceph-mon[77178]:    application not enabled on pool 'volumes'
Jan 26 12:41:55 np0005596062 ceph-mon[77178]:    application not enabled on pool 'backups'
Jan 26 12:41:55 np0005596062 ceph-mon[77178]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.qpyzhk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'alerts'
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'balancer'
Jan 26 12:41:55 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:41:55.576+0000 7fc87cc40140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 12:41:55 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'cephadm'
Jan 26 12:41:55 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:41:55.821+0000 7fc87cc40140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3490187801' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.qpyzhk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: Deploying daemon mgr.compute-1.qpyzhk on compute-1
Jan 26 12:41:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e22 e22: 2 total, 2 up, 2 in
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e23 e23: 2 total, 2 up, 2 in
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3814547469' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 12:41:57 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'crash'
Jan 26 12:41:57 np0005596062 ceph-mgr[77538]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 12:41:57 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'dashboard'
Jan 26 12:41:57 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:41:57.994+0000 7fc87cc40140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.028178546 +0000 UTC m=+0.056496942 container create 2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_brown, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 26 12:41:58 np0005596062 systemd[1]: Started libpod-conmon-2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37.scope.
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.002558221 +0000 UTC m=+0.030876697 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:58 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.135031834 +0000 UTC m=+0.163350250 container init 2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_brown, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.147345733 +0000 UTC m=+0.175664119 container start 2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_brown, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.151208287 +0000 UTC m=+0.179526673 container attach 2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_brown, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 26 12:41:58 np0005596062 happy_brown[77730]: 167 167
Jan 26 12:41:58 np0005596062 systemd[1]: libpod-2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37.scope: Deactivated successfully.
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.155109281 +0000 UTC m=+0.183427677 container died 2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_brown, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2)
Jan 26 12:41:58 np0005596062 systemd[1]: var-lib-containers-storage-overlay-f7ddea6ec1b40abd82c4bd189bd74baa43262f1198eeb8b487d150f7dd1c8a55-merged.mount: Deactivated successfully.
Jan 26 12:41:58 np0005596062 podman[77714]: 2026-01-26 17:41:58.196469307 +0000 UTC m=+0.224787703 container remove 2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_brown, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:41:58 np0005596062 systemd[1]: libpod-conmon-2d48f9b613997d640daf7c51ed9dba4b491966aa0a5eb938b21cfee1b78e8a37.scope: Deactivated successfully.
Jan 26 12:41:58 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:58 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:58 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 26 12:41:58 np0005596062 ceph-mon[77178]: Deploying daemon crash.compute-2 on compute-2
Jan 26 12:41:58 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3814547469' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 12:41:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:41:58 np0005596062 systemd[1]: Reloading.
Jan 26 12:41:58 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:41:58 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:41:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e23 _set_new_cache_sizes cache_size:1019934774 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:41:58 np0005596062 systemd[1]: Starting Ceph crash.compute-2 for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:41:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e24 e24: 2 total, 2 up, 2 in
Jan 26 12:41:59 np0005596062 podman[77869]: 2026-01-26 17:41:59.098278065 +0000 UTC m=+0.055464794 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:41:59 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'devicehealth'
Jan 26 12:41:59 np0005596062 ceph-mgr[77538]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 12:41:59 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 12:41:59 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:41:59.706+0000 7fc87cc40140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 12:42:00 np0005596062 podman[77869]: 2026-01-26 17:42:00.045278122 +0000 UTC m=+1.002464801 container create 518fd5dcf420e0a042d907625995e92b79c22cb683b492d796f0e49d23978bc7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 26 12:42:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 12:42:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 12:42:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]:  from numpy import show_config as show_numpy_config
Jan 26 12:42:00 np0005596062 ceph-mgr[77538]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 12:42:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:00.219+0000 7fc87cc40140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 12:42:00 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'influx'
Jan 26 12:42:00 np0005596062 ceph-mgr[77538]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 12:42:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:00.466+0000 7fc87cc40140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 12:42:00 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'insights'
Jan 26 12:42:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e25 e25: 2 total, 2 up, 2 in
Jan 26 12:42:00 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4de90a68abe349ff2180a2d80e69caaa990acc2ea114a44f0b41a018623d743/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:00 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4de90a68abe349ff2180a2d80e69caaa990acc2ea114a44f0b41a018623d743/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:00 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4de90a68abe349ff2180a2d80e69caaa990acc2ea114a44f0b41a018623d743/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:00 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4de90a68abe349ff2180a2d80e69caaa990acc2ea114a44f0b41a018623d743/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:00 np0005596062 podman[77869]: 2026-01-26 17:42:00.624467431 +0000 UTC m=+1.581654110 container init 518fd5dcf420e0a042d907625995e92b79c22cb683b492d796f0e49d23978bc7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507)
Jan 26 12:42:00 np0005596062 podman[77869]: 2026-01-26 17:42:00.631252763 +0000 UTC m=+1.588439422 container start 518fd5dcf420e0a042d907625995e92b79c22cb683b492d796f0e49d23978bc7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:00 np0005596062 bash[77869]: 518fd5dcf420e0a042d907625995e92b79c22cb683b492d796f0e49d23978bc7
Jan 26 12:42:00 np0005596062 systemd[1]: Started Ceph crash.compute-2 for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:42:00 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'iostat'
Jan 26 12:42:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 26 12:42:01 np0005596062 ceph-mgr[77538]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:01.005+0000 7fc87cc40140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 12:42:01 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'k8sevents'
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.068+0000 7f301224a640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.068+0000 7f301224a640 -1 AuthRegistry(0x7f300c0675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.069+0000 7f301224a640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.069+0000 7f301224a640 -1 AuthRegistry(0x7f3012249000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.070+0000 7f300affd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.071+0000 7f300b7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.072+0000 7f300bfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: 2026-01-26T17:42:01.072+0000 7f301224a640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 26 12:42:01 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-crash-compute-2[77884]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.370885444 +0000 UTC m=+0.041370988 container create a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 12:42:01 np0005596062 systemd[1]: Started libpod-conmon-a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566.scope.
Jan 26 12:42:01 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.349032469 +0000 UTC m=+0.019517993 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.451613293 +0000 UTC m=+0.122098847 container init a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_mirzakhani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.461850746 +0000 UTC m=+0.132336260 container start a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_mirzakhani, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.466512631 +0000 UTC m=+0.136998155 container attach a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 26 12:42:01 np0005596062 systemd[1]: libpod-a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566.scope: Deactivated successfully.
Jan 26 12:42:01 np0005596062 conmon[78058]: conmon a36cd2786065ec2ecce8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566.scope/container/memory.events
Jan 26 12:42:01 np0005596062 distracted_mirzakhani[78058]: 167 167
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.473650732 +0000 UTC m=+0.144136266 container died a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_mirzakhani, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:01 np0005596062 systemd[1]: var-lib-containers-storage-overlay-73f08384cff952fe165aec5ed71d12ec21c4045ebe876d4a6c4bef8f59b94fbf-merged.mount: Deactivated successfully.
Jan 26 12:42:01 np0005596062 podman[78041]: 2026-01-26 17:42:01.513333923 +0000 UTC m=+0.183819427 container remove a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_mirzakhani, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:42:01 np0005596062 systemd[1]: libpod-conmon-a36cd2786065ec2ecce8393d20015eea3db61f9178153d1eb33824acc7e42566.scope: Deactivated successfully.
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1605417848' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:42:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e26 e26: 2 total, 2 up, 2 in
Jan 26 12:42:01 np0005596062 podman[78081]: 2026-01-26 17:42:01.6754803 +0000 UTC m=+0.042501798 container create d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_villani, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:01 np0005596062 systemd[1]: Started libpod-conmon-d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637.scope.
Jan 26 12:42:01 np0005596062 podman[78081]: 2026-01-26 17:42:01.654900629 +0000 UTC m=+0.021922077 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:01 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:01 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bff0fe518ed59516e5972e6ca07188c8b6a528e0bd4c894e185c24ede680e1c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:01 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bff0fe518ed59516e5972e6ca07188c8b6a528e0bd4c894e185c24ede680e1c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:01 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bff0fe518ed59516e5972e6ca07188c8b6a528e0bd4c894e185c24ede680e1c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:01 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bff0fe518ed59516e5972e6ca07188c8b6a528e0bd4c894e185c24ede680e1c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:01 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bff0fe518ed59516e5972e6ca07188c8b6a528e0bd4c894e185c24ede680e1c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:01 np0005596062 podman[78081]: 2026-01-26 17:42:01.773956753 +0000 UTC m=+0.140978181 container init d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_villani, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default)
Jan 26 12:42:01 np0005596062 podman[78081]: 2026-01-26 17:42:01.781797033 +0000 UTC m=+0.148818441 container start d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_villani, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 12:42:01 np0005596062 podman[78081]: 2026-01-26 17:42:01.785094071 +0000 UTC m=+0.152115579 container attach d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_villani, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:42:02 np0005596062 naughty_villani[78098]: --> passed data devices: 0 physical, 1 LVM
Jan 26 12:42:02 np0005596062 naughty_villani[78098]: --> relative data size: 1.0
Jan 26 12:42:02 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 12:42:02 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 9cf3a1cc-aed3-427e-a898-1ddf0c091222
Jan 26 12:42:02 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'localpool'
Jan 26 12:42:03 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 12:42:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222"} v 0) v1
Jan 26 12:42:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1601559491' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222"}]: dispatch
Jan 26 12:42:03 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1605417848' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 26 12:42:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e26 _set_new_cache_sizes cache_size:1020053274 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:03 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'mirroring'
Jan 26 12:42:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e27 e27: 2 total, 2 up, 2 in
Jan 26 12:42:03 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'nfs'
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.102:0/1601559491' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222"}]: dispatch
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1717325685' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222"}]: dispatch
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:04 np0005596062 lvm[78146]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 12:42:04 np0005596062 lvm[78146]: VG ceph_vg0 finished
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 26 12:42:04 np0005596062 ceph-mgr[77538]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 12:42:04 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'orchestrator'
Jan 26 12:42:04 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:04.712+0000 7fc87cc40140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Jan 26 12:42:04 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3526785209' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: stderr: got monmap epoch 3
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: --> Creating keyring file for osd.2
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 26 12:42:04 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 9cf3a1cc-aed3-427e-a898-1ddf0c091222 --setuser ceph --setgroup ceph
Jan 26 12:42:05 np0005596062 ceph-mgr[77538]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 12:42:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:05.461+0000 7fc87cc40140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 12:42:05 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 12:42:05 np0005596062 ceph-mgr[77538]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 12:42:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:05.734+0000 7fc87cc40140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 12:42:05 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'osd_support'
Jan 26 12:42:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e29 e29: 3 total, 2 up, 3 in
Jan 26 12:42:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:42:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:42:05 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1717325685' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 26 12:42:05 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222"}]': finished
Jan 26 12:42:05 np0005596062 ceph-mgr[77538]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 12:42:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:05.978+0000 7fc87cc40140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 12:42:05 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 12:42:06 np0005596062 ceph-mgr[77538]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 12:42:06 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:06.276+0000 7fc87cc40140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 12:42:06 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'progress'
Jan 26 12:42:06 np0005596062 ceph-mgr[77538]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 12:42:06 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:06.532+0000 7fc87cc40140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 12:42:06 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'prometheus'
Jan 26 12:42:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e30 e30: 3 total, 2 up, 3 in
Jan 26 12:42:06 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/814326835' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 26 12:42:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:06 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/814326835' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 26 12:42:07 np0005596062 ceph-mgr[77538]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 12:42:07 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:07.636+0000 7fc87cc40140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 12:42:07 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'rbd_support'
Jan 26 12:42:07 np0005596062 ceph-mgr[77538]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 12:42:07 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:07.952+0000 7fc87cc40140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 12:42:07 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/2848215916' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 26 12:42:07 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'restful'
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: stderr: 2026-01-26T17:42:04.970+0000 7f62389e4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: stderr: 2026-01-26T17:42:04.970+0000 7f62389e4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: stderr: 2026-01-26T17:42:04.970+0000 7f62389e4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: stderr: 2026-01-26T17:42:04.970+0000 7f62389e4740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 26 12:42:08 np0005596062 naughty_villani[78098]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 26 12:42:08 np0005596062 systemd[1]: libpod-d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637.scope: Deactivated successfully.
Jan 26 12:42:08 np0005596062 podman[78081]: 2026-01-26 17:42:08.311837823 +0000 UTC m=+6.678859251 container died d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_villani, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:08 np0005596062 systemd[1]: libpod-d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637.scope: Consumed 2.541s CPU time.
Jan 26 12:42:08 np0005596062 systemd[1]: var-lib-containers-storage-overlay-7bff0fe518ed59516e5972e6ca07188c8b6a528e0bd4c894e185c24ede680e1c-merged.mount: Deactivated successfully.
Jan 26 12:42:08 np0005596062 podman[78081]: 2026-01-26 17:42:08.366048763 +0000 UTC m=+6.733070171 container remove d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_villani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True)
Jan 26 12:42:08 np0005596062 systemd[1]: libpod-conmon-d84956d0a8469b64df7103df1cc82768e14c215d49c58dfc51cb5d607221f637.scope: Deactivated successfully.
Jan 26 12:42:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054713 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Jan 26 12:42:08 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'rgw'
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.028254343 +0000 UTC m=+0.043651669 container create 08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_greider, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 26 12:42:09 np0005596062 systemd[1]: Started libpod-conmon-08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450.scope.
Jan 26 12:42:09 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.008162586 +0000 UTC m=+0.023559942 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.111787477 +0000 UTC m=+0.127184803 container init 08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_greider, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.124643661 +0000 UTC m=+0.140040997 container start 08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.128960646 +0000 UTC m=+0.144357992 container attach 08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_greider, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 26 12:42:09 np0005596062 hungry_greider[79227]: 167 167
Jan 26 12:42:09 np0005596062 systemd[1]: libpod-08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450.scope: Deactivated successfully.
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.132171242 +0000 UTC m=+0.147568608 container died 08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_greider, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:42:09 np0005596062 systemd[1]: var-lib-containers-storage-overlay-771738919fa2a6558f35b35d458635e6d58f4ee9e590b6610dd337a364bba798-merged.mount: Deactivated successfully.
Jan 26 12:42:09 np0005596062 podman[79211]: 2026-01-26 17:42:09.181916972 +0000 UTC m=+0.197314298 container remove 08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hungry_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Jan 26 12:42:09 np0005596062 systemd[1]: libpod-conmon-08b62308e1da232a26a4adb2dd751ec2cc24381cd7fd0ea14be325eb7444a450.scope: Deactivated successfully.
Jan 26 12:42:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:09 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/2848215916' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 26 12:42:09 np0005596062 podman[79252]: 2026-01-26 17:42:09.345670282 +0000 UTC m=+0.039809856 container create 946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 26 12:42:09 np0005596062 systemd[1]: Started libpod-conmon-946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0.scope.
Jan 26 12:42:09 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:09 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e13fc9c758cfcce7fec769c0796ce6ca0d2c4934ccf4d6d3d402cc3ad7fd8c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:09 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e13fc9c758cfcce7fec769c0796ce6ca0d2c4934ccf4d6d3d402cc3ad7fd8c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:09 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e13fc9c758cfcce7fec769c0796ce6ca0d2c4934ccf4d6d3d402cc3ad7fd8c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:09 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e13fc9c758cfcce7fec769c0796ce6ca0d2c4934ccf4d6d3d402cc3ad7fd8c5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:09 np0005596062 podman[79252]: 2026-01-26 17:42:09.328879393 +0000 UTC m=+0.023018987 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:09 np0005596062 podman[79252]: 2026-01-26 17:42:09.435394912 +0000 UTC m=+0.129534506 container init 946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:09 np0005596062 podman[79252]: 2026-01-26 17:42:09.443721234 +0000 UTC m=+0.137860828 container start 946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 12:42:09 np0005596062 ceph-mgr[77538]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 12:42:09 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'rook'
Jan 26 12:42:09 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:09.443+0000 7fc87cc40140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 12:42:09 np0005596062 podman[79252]: 2026-01-26 17:42:09.447137366 +0000 UTC m=+0.141276940 container attach 946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]: {
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:    "2": [
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:        {
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "devices": [
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "/dev/loop3"
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            ],
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "lv_name": "ceph_lv0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "lv_size": "7511998464",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=1rh8dR-G1Vw-9U8i-3Yfy-b8vr-cmKE-oI1kkv,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=d4cd1917-5876-51b6-bc64-65a16199754d,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9cf3a1cc-aed3-427e-a898-1ddf0c091222,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "lv_uuid": "1rh8dR-G1Vw-9U8i-3Yfy-b8vr-cmKE-oI1kkv",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "name": "ceph_lv0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "tags": {
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.block_uuid": "1rh8dR-G1Vw-9U8i-3Yfy-b8vr-cmKE-oI1kkv",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.cephx_lockbox_secret": "",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.cluster_fsid": "d4cd1917-5876-51b6-bc64-65a16199754d",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.cluster_name": "ceph",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.crush_device_class": "",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.encrypted": "0",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.osd_fsid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.osd_id": "2",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.type": "block",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:                "ceph.vdo": "0"
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            },
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "type": "block",
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:            "vg_name": "ceph_vg0"
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:        }
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]:    ]
Jan 26 12:42:10 np0005596062 naughty_aryabhata[79268]: }
Jan 26 12:42:10 np0005596062 systemd[1]: libpod-946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0.scope: Deactivated successfully.
Jan 26 12:42:10 np0005596062 podman[79252]: 2026-01-26 17:42:10.219962914 +0000 UTC m=+0.914102488 container died 946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef)
Jan 26 12:42:10 np0005596062 ceph-mon[77178]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 12:42:10 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/694454554' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 26 12:42:10 np0005596062 systemd[1]: var-lib-containers-storage-overlay-6e13fc9c758cfcce7fec769c0796ce6ca0d2c4934ccf4d6d3d402cc3ad7fd8c5-merged.mount: Deactivated successfully.
Jan 26 12:42:10 np0005596062 podman[79252]: 2026-01-26 17:42:10.294897628 +0000 UTC m=+0.989037242 container remove 946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_aryabhata, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 12:42:10 np0005596062 systemd[1]: libpod-conmon-946ce845a022b34adb4118d43aa570fcc28b5807f930f82ec03691e16e5398a0.scope: Deactivated successfully.
Jan 26 12:42:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Jan 26 12:42:10 np0005596062 podman[79428]: 2026-01-26 17:42:10.878489016 +0000 UTC m=+0.034397191 container create 2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 26 12:42:10 np0005596062 systemd[1]: Started libpod-conmon-2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f.scope.
Jan 26 12:42:10 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:10 np0005596062 podman[79428]: 2026-01-26 17:42:10.951303454 +0000 UTC m=+0.107211659 container init 2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_brattain, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:42:10 np0005596062 podman[79428]: 2026-01-26 17:42:10.958734212 +0000 UTC m=+0.114642397 container start 2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_brattain, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 26 12:42:10 np0005596062 podman[79428]: 2026-01-26 17:42:10.864526493 +0000 UTC m=+0.020434698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:10 np0005596062 exciting_brattain[79445]: 167 167
Jan 26 12:42:10 np0005596062 podman[79428]: 2026-01-26 17:42:10.968603406 +0000 UTC m=+0.124511701 container attach 2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_brattain, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:10 np0005596062 systemd[1]: libpod-2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f.scope: Deactivated successfully.
Jan 26 12:42:10 np0005596062 podman[79428]: 2026-01-26 17:42:10.969043758 +0000 UTC m=+0.124951943 container died 2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_brattain, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:10 np0005596062 systemd[1]: var-lib-containers-storage-overlay-a2fa1595d7565d5f6f45e4c83ee3af562f165718ff1cef23d91ceee1c2c53623-merged.mount: Deactivated successfully.
Jan 26 12:42:11 np0005596062 podman[79428]: 2026-01-26 17:42:11.002969155 +0000 UTC m=+0.158877340 container remove 2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_brattain, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:42:11 np0005596062 systemd[1]: libpod-conmon-2b19364b6b836fdf141d9f5705c9cd02eedc9549efe749b4a8665e7e89a8dd4f.scope: Deactivated successfully.
Jan 26 12:42:11 np0005596062 podman[79479]: 2026-01-26 17:42:11.242611193 +0000 UTC m=+0.046087823 container create 44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 26 12:42:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 26 12:42:11 np0005596062 ceph-mon[77178]: Deploying daemon osd.2 on compute-2
Jan 26 12:42:11 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/694454554' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 26 12:42:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:11 np0005596062 systemd[1]: Started libpod-conmon-44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb.scope.
Jan 26 12:42:11 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a245f078182058eb0cf5e3102cab7d4546091500328ffd503414bd86a9b20093/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a245f078182058eb0cf5e3102cab7d4546091500328ffd503414bd86a9b20093/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a245f078182058eb0cf5e3102cab7d4546091500328ffd503414bd86a9b20093/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a245f078182058eb0cf5e3102cab7d4546091500328ffd503414bd86a9b20093/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a245f078182058eb0cf5e3102cab7d4546091500328ffd503414bd86a9b20093/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:11 np0005596062 podman[79479]: 2026-01-26 17:42:11.320084045 +0000 UTC m=+0.123560665 container init 44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 26 12:42:11 np0005596062 podman[79479]: 2026-01-26 17:42:11.223948484 +0000 UTC m=+0.027425124 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:11 np0005596062 podman[79479]: 2026-01-26 17:42:11.330608237 +0000 UTC m=+0.134084857 container start 44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:11 np0005596062 podman[79479]: 2026-01-26 17:42:11.333613407 +0000 UTC m=+0.137090027 container attach 44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 26 12:42:11 np0005596062 ceph-mgr[77538]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 12:42:11 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'selftest'
Jan 26 12:42:11 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:11.650+0000 7fc87cc40140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 12:42:11 np0005596062 ceph-mgr[77538]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 12:42:11 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'snap_schedule'
Jan 26 12:42:11 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:11.902+0000 7fc87cc40140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 12:42:11 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test[79495]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 26 12:42:11 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test[79495]:                            [--no-systemd] [--no-tmpfs]
Jan 26 12:42:11 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test[79495]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 26 12:42:11 np0005596062 systemd[1]: libpod-44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb.scope: Deactivated successfully.
Jan 26 12:42:11 np0005596062 podman[79479]: 2026-01-26 17:42:11.986077087 +0000 UTC m=+0.789553717 container died 44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:12 np0005596062 systemd[1]: var-lib-containers-storage-overlay-a245f078182058eb0cf5e3102cab7d4546091500328ffd503414bd86a9b20093-merged.mount: Deactivated successfully.
Jan 26 12:42:12 np0005596062 podman[79479]: 2026-01-26 17:42:12.04603005 +0000 UTC m=+0.849506710 container remove 44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:12 np0005596062 systemd[1]: libpod-conmon-44549fa921ccb86651d58e853ff414aebfa2e173c709a35c3ca1d6d3c4bd9dfb.scope: Deactivated successfully.
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'stats'
Jan 26 12:42:12 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:12.187+0000 7fc87cc40140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 12:42:12 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3523141357' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 26 12:42:12 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:12 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:12 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'status'
Jan 26 12:42:12 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:12 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:12 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'telegraf'
Jan 26 12:42:12 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:12.729+0000 7fc87cc40140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 12:42:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Jan 26 12:42:12 np0005596062 systemd[1]: Starting Ceph osd.2 for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 12:42:12 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'telemetry'
Jan 26 12:42:12 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:12.962+0000 7fc87cc40140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 12:42:13 np0005596062 podman[79654]: 2026-01-26 17:42:13.105074684 +0000 UTC m=+0.049970518 container create d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:42:13 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:13 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8ef7712de9db022eb9985a3c7e35efe845b695aa75740ee12550643076efd8c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:13 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8ef7712de9db022eb9985a3c7e35efe845b695aa75740ee12550643076efd8c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:13 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8ef7712de9db022eb9985a3c7e35efe845b695aa75740ee12550643076efd8c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:13 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8ef7712de9db022eb9985a3c7e35efe845b695aa75740ee12550643076efd8c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:13 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8ef7712de9db022eb9985a3c7e35efe845b695aa75740ee12550643076efd8c/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:13 np0005596062 podman[79654]: 2026-01-26 17:42:13.081470002 +0000 UTC m=+0.026365926 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:13 np0005596062 podman[79654]: 2026-01-26 17:42:13.206751963 +0000 UTC m=+0.151647817 container init d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:13 np0005596062 podman[79654]: 2026-01-26 17:42:13.21410317 +0000 UTC m=+0.158998994 container start d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:13 np0005596062 podman[79654]: 2026-01-26 17:42:13.217144341 +0000 UTC m=+0.162040155 container attach d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:13 np0005596062 ceph-mgr[77538]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 12:42:13 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:13.656+0000 7fc87cc40140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 12:42:13 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 12:42:13 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3523141357' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 26 12:42:14 np0005596062 bash[79654]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 26 12:42:14 np0005596062 bash[79654]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 26 12:42:14 np0005596062 bash[79654]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 12:42:14 np0005596062 bash[79654]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:14 np0005596062 bash[79654]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 26 12:42:14 np0005596062 bash[79654]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate[79669]: --> ceph-volume raw activate successful for osd ID: 2
Jan 26 12:42:14 np0005596062 bash[79654]: --> ceph-volume raw activate successful for osd ID: 2
Jan 26 12:42:14 np0005596062 systemd[1]: libpod-d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a.scope: Deactivated successfully.
Jan 26 12:42:14 np0005596062 podman[79654]: 2026-01-26 17:42:14.156514834 +0000 UTC m=+1.101410668 container died d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 26 12:42:14 np0005596062 systemd[1]: var-lib-containers-storage-overlay-c8ef7712de9db022eb9985a3c7e35efe845b695aa75740ee12550643076efd8c-merged.mount: Deactivated successfully.
Jan 26 12:42:14 np0005596062 podman[79654]: 2026-01-26 17:42:14.206326166 +0000 UTC m=+1.151222000 container remove d879c17ddafb59c42159c3107213172ba8a4833129c925a2c0a9b45e7586f73a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 26 12:42:14 np0005596062 ceph-mgr[77538]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 12:42:14 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:14.424+0000 7fc87cc40140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 12:42:14 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'volumes'
Jan 26 12:42:14 np0005596062 podman[79846]: 2026-01-26 17:42:14.469602737 +0000 UTC m=+0.071539984 container create 7a199a14029f3cbd8a9ad6d6970372dbfc60bf767f3d69462d15510402e27a17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:42:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a5729d7e51b01dc022438364b2cfe685e983f8e962f9a1cbb59fa4578d7c11/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a5729d7e51b01dc022438364b2cfe685e983f8e962f9a1cbb59fa4578d7c11/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a5729d7e51b01dc022438364b2cfe685e983f8e962f9a1cbb59fa4578d7c11/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a5729d7e51b01dc022438364b2cfe685e983f8e962f9a1cbb59fa4578d7c11/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:14 np0005596062 podman[79846]: 2026-01-26 17:42:14.438547347 +0000 UTC m=+0.040484664 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a5729d7e51b01dc022438364b2cfe685e983f8e962f9a1cbb59fa4578d7c11/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:14 np0005596062 podman[79846]: 2026-01-26 17:42:14.540187025 +0000 UTC m=+0.142124302 container init 7a199a14029f3cbd8a9ad6d6970372dbfc60bf767f3d69462d15510402e27a17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 26 12:42:14 np0005596062 podman[79846]: 2026-01-26 17:42:14.555625038 +0000 UTC m=+0.157562285 container start 7a199a14029f3cbd8a9ad6d6970372dbfc60bf767f3d69462d15510402e27a17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 26 12:42:14 np0005596062 bash[79846]: 7a199a14029f3cbd8a9ad6d6970372dbfc60bf767f3d69462d15510402e27a17
Jan 26 12:42:14 np0005596062 systemd[1]: Started Ceph osd.2 for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: pidfile_write: ignore empty --pid-file
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647a98c1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647a98c1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647a98c1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647a98c1c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647aa6cd000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647aa6cd000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647aa6cd000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647aa6cd000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647aa6cd000 /var/lib/ceph/osd/ceph-2/block) close
Jan 26 12:42:14 np0005596062 ceph-osd[79865]: bdev(0x5647a98c1c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 26 12:42:14 np0005596062 ceph-mon[77178]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 26 12:42:14 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 12:42:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: load: jerasure load: lrc 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 26 12:42:15 np0005596062 ceph-mgr[77538]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 12:42:15 np0005596062 ceph-mgr[77538]: mgr[py] Loading python module 'zabbix'
Jan 26 12:42:15 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:15.171+0000 7fc87cc40140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.34209312 +0000 UTC m=+0.055804723 container create d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 26 12:42:15 np0005596062 systemd[1]: Started libpod-conmon-d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4.scope.
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.31926414 +0000 UTC m=+0.032975823 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:15 np0005596062 ceph-mgr[77538]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 12:42:15 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mgr-compute-2-cchxrf[77534]: 2026-01-26T17:42:15.414+0000 7fc87cc40140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 12:42:15 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:15 np0005596062 ceph-mgr[77538]: ms_deliver_dispatch: unhandled message 0x559d0e80f080 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Jan 26 12:42:15 np0005596062 ceph-mgr[77538]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2716354406
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.437653176 +0000 UTC m=+0.151364809 container init d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.449333118 +0000 UTC m=+0.163044721 container start d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507)
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.452531234 +0000 UTC m=+0.166242847 container attach d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 12:42:15 np0005596062 upbeat_goldberg[80039]: 167 167
Jan 26 12:42:15 np0005596062 systemd[1]: libpod-d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4.scope: Deactivated successfully.
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.45611464 +0000 UTC m=+0.169826253 container died d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 26 12:42:15 np0005596062 systemd[1]: var-lib-containers-storage-overlay-4355179eb13485b686e4d7d8b8297712cc90acc30ef65725284cb1e825221830-merged.mount: Deactivated successfully.
Jan 26 12:42:15 np0005596062 podman[80023]: 2026-01-26 17:42:15.495098822 +0000 UTC m=+0.208810415 container remove d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_goldberg, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3)
Jan 26 12:42:15 np0005596062 systemd[1]: libpod-conmon-d8e1ded4a828dd73d9f4d874cc84991a3f83a3454227c9c7a18becd8693dafe4.scope: Deactivated successfully.
Jan 26 12:42:15 np0005596062 podman[80066]: 2026-01-26 17:42:15.644244041 +0000 UTC m=+0.043505914 container create 28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_torvalds, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 12:42:15 np0005596062 systemd[1]: Started libpod-conmon-28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162.scope.
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa750c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs mount
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs mount shared_bdev_used = 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 26 12:42:15 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: RocksDB version: 7.9.2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Git sha 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: DB SUMMARY
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: DB Session ID:  L2B72T7BUK4WWEF0XWSU
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: CURRENT file:  CURRENT
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.error_if_exists: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.create_if_missing: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                     Options.env: 0x5647aa753dc0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                Options.info_log: 0x5647a993eba0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                              Options.statistics: (nil)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.use_fsync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 12:42:15 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0a523ef91834095a2bc74fa357425abf009c4a1df1eebc2803b59138cd526fa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:15 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0a523ef91834095a2bc74fa357425abf009c4a1df1eebc2803b59138cd526fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:15 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0a523ef91834095a2bc74fa357425abf009c4a1df1eebc2803b59138cd526fa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 12:42:15 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0a523ef91834095a2bc74fa357425abf009c4a1df1eebc2803b59138cd526fa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                              Options.db_log_dir: 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                 Options.wal_dir: db.wal
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.write_buffer_manager: 0x5647aa852460
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.unordered_write: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.row_cache: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                              Options.wal_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.two_write_queues: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.wal_compression: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.atomic_flush: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_background_jobs: 4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_background_compactions: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_subcompactions: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.max_open_files: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Compression algorithms supported:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kZSTD supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kXpressCompression supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kBZip2Compression supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kLZ4Compression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kZlibCompression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: #011kSnappyCompression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 podman[80066]: 2026-01-26 17:42:15.715850166 +0000 UTC m=+0.115112079 container init 28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_torvalds, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 podman[80066]: 2026-01-26 17:42:15.627489463 +0000 UTC m=+0.026751356 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 podman[80066]: 2026-01-26 17:42:15.724263871 +0000 UTC m=+0.123525754 container start 28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_torvalds, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:15 np0005596062 podman[80066]: 2026-01-26 17:42:15.728123844 +0000 UTC m=+0.127385717 container attach 28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_torvalds, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993e5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a9934430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ea286723-8b66-4583-b0aa-bf572a0e47eb
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449335721292, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449335721467, "job": 1, "event": "recovery_finished"}
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: freelist init
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: freelist _read_cfg
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs umount
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) close
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bdev(0x5647aa751400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs mount
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluefs mount shared_bdev_used = 4718592
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: RocksDB version: 7.9.2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Git sha 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: DB SUMMARY
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: DB Session ID:  L2B72T7BUK4WWEF0XWSV
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: CURRENT file:  CURRENT
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.error_if_exists: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.create_if_missing: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                     Options.env: 0x5647a9a8c3f0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                Options.info_log: 0x5647a993f860
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                              Options.statistics: (nil)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.use_fsync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                              Options.db_log_dir: 
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                                 Options.wal_dir: db.wal
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.write_buffer_manager: 0x5647aa852460
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.unordered_write: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.row_cache: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                              Options.wal_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.two_write_queues: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.wal_compression: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.atomic_flush: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_background_jobs: 4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_background_compactions: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_subcompactions: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.max_open_files: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Compression algorithms supported:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kZSTD supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kXpressCompression supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kBZip2Compression supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kLZ4Compression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kZlibCompression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: 	kSnappyCompression supported: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a991bb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a9935350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993fe20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a99354b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993fe20)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5647a99354b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:           Options.merge_operator: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5647a993fe20)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5647a99354b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.compression: LZ4
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.num_levels: 7
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 12:42:15 np0005596062 ceph-osd[79865]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                           Options.bloom_locality: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                               Options.ttl: 2592000
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                       Options.enable_blob_files: false
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                           Options.min_blob_size: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ea286723-8b66-4583-b0aa-bf572a0e47eb
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449335993790, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449335997480, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449335, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ea286723-8b66-4583-b0aa-bf572a0e47eb", "db_session_id": "L2B72T7BUK4WWEF0XWSV", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449336002184, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449335, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ea286723-8b66-4583-b0aa-bf572a0e47eb", "db_session_id": "L2B72T7BUK4WWEF0XWSV", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449336006241, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449336, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ea286723-8b66-4583-b0aa-bf572a0e47eb", "db_session_id": "L2B72T7BUK4WWEF0XWSV", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449336008258, "job": 1, "event": "recovery_finished"}
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5647a99f3c00
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: DB pointer 0x5647aa83da00
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5647a9935350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5647a9935350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: _get_class not permitted to load lua
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: _get_class not permitted to load sdk
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: _get_class not permitted to load test_remote_reads
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 load_pgs
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 load_pgs opened 0 pgs
Jan 26 12:42:16 np0005596062 ceph-osd[79865]: osd.2 0 log_to_monitors true
Jan 26 12:42:16 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2[79861]: 2026-01-26T17:42:16.038+0000 7fbfd456a740 -1 osd.2 0 log_to_monitors true
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/815499186,v1:192.168.122.102:6801/815499186]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]: {
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:    "9cf3a1cc-aed3-427e-a898-1ddf0c091222": {
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:        "ceph_fsid": "d4cd1917-5876-51b6-bc64-65a16199754d",
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:        "osd_id": 2,
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:        "osd_uuid": "9cf3a1cc-aed3-427e-a898-1ddf0c091222",
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:        "type": "bluestore"
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]:    }
Jan 26 12:42:16 np0005596062 nervous_torvalds[80082]: }
Jan 26 12:42:16 np0005596062 systemd[1]: libpod-28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162.scope: Deactivated successfully.
Jan 26 12:42:16 np0005596062 podman[80066]: 2026-01-26 17:42:16.639834658 +0000 UTC m=+1.039096531 container died 28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_torvalds, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:16 np0005596062 systemd[1]: var-lib-containers-storage-overlay-a0a523ef91834095a2bc74fa357425abf009c4a1df1eebc2803b59138cd526fa-merged.mount: Deactivated successfully.
Jan 26 12:42:16 np0005596062 podman[80066]: 2026-01-26 17:42:16.695028994 +0000 UTC m=+1.094290877 container remove 28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_torvalds, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Jan 26 12:42:16 np0005596062 systemd[1]: libpod-conmon-28859c4c8c2bd1694de0909e747746cc81240bb94cb05627adf9f1ed3ea16162.scope: Deactivated successfully.
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e34 e34: 3 total, 2 up, 3 in
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/815499186,v1:192.168.122.102:6801/815499186]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: from='osd.2 [v2:192.168.122.102:6800/815499186,v1:192.168.122.102:6801/815499186]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/2401748601' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/2401748601' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 26 12:42:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e35 e35: 3 total, 2 up, 3 in
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0 done with init, starting boot process
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0 start_boot
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 26 12:42:17 np0005596062 ceph-osd[79865]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 26 12:42:17 np0005596062 ceph-mon[77178]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 26 12:42:17 np0005596062 ceph-mon[77178]: from='osd.2 [v2:192.168.122.102:6800/815499186,v1:192.168.122.102:6801/815499186]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 12:42:17 np0005596062 ceph-mon[77178]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 12:42:17 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/3160412710' entity='client.admin' 
Jan 26 12:42:17 np0005596062 ceph-mon[77178]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 26 12:42:18 np0005596062 podman[80745]: 2026-01-26 17:42:18.102023042 +0000 UTC m=+0.074768190 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 26 12:42:18 np0005596062 podman[80745]: 2026-01-26 17:42:18.484378587 +0000 UTC m=+0.457123735 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:42:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:20 np0005596062 ceph-mon[77178]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 26 12:42:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:20 np0005596062 ceph-mon[77178]: Saving service ingress.rgw.default spec with placement count:2
Jan 26 12:42:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.647661953 +0000 UTC m=+0.044487961 container create 893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bouman, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 26 12:42:20 np0005596062 systemd[1]: Started libpod-conmon-893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb.scope.
Jan 26 12:42:20 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.630257528 +0000 UTC m=+0.027083556 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.728857865 +0000 UTC m=+0.125683913 container init 893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bouman, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.735483352 +0000 UTC m=+0.132309390 container start 893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bouman, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 26 12:42:20 np0005596062 elegant_bouman[81111]: 167 167
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.740530147 +0000 UTC m=+0.137356375 container attach 893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bouman, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:42:20 np0005596062 systemd[1]: libpod-893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb.scope: Deactivated successfully.
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.74102527 +0000 UTC m=+0.137851278 container died 893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bouman, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 31.715 iops: 8119.114 elapsed_sec: 0.369
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: log_channel(cluster) log [WRN] : OSD bench result of 8119.114292 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 0 waiting for initial osdmap
Jan 26 12:42:20 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2[79861]: 2026-01-26T17:42:20.750+0000 7fbfd0d01640 -1 osd.2 0 waiting for initial osdmap
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 check_osdmap_features require_osd_release unknown -> reef
Jan 26 12:42:20 np0005596062 systemd[1]: var-lib-containers-storage-overlay-eb17a111991c38a203884d6344af7af3ed8666d892a63a06bfecde632e2f4365-merged.mount: Deactivated successfully.
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 set_numa_affinity not setting numa affinity
Jan 26 12:42:20 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-osd-2[79861]: 2026-01-26T17:42:20.781+0000 7fbfcbb12640 -1 osd.2 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 26 12:42:20 np0005596062 podman[81094]: 2026-01-26 17:42:20.783795054 +0000 UTC m=+0.180621062 container remove 893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_bouman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 26 12:42:20 np0005596062 ceph-osd[79865]: osd.2 35 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 26 12:42:20 np0005596062 systemd[1]: libpod-conmon-893e41bfaad405ea25d5343c9f2bde661a8aec9c53fd9b8c7417b2e280e614bb.scope: Deactivated successfully.
Jan 26 12:42:20 np0005596062 podman[81135]: 2026-01-26 17:42:20.94891859 +0000 UTC m=+0.035015077 container create 727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_stonebraker, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:42:21 np0005596062 systemd[1]: Started libpod-conmon-727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251.scope.
Jan 26 12:42:21 np0005596062 podman[81135]: 2026-01-26 17:42:20.934792512 +0000 UTC m=+0.020888999 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:21 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:21 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9438fcb25b468b509ca320892dbdda35cf3a9cbb138a1091efd1897c6655392/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:21 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9438fcb25b468b509ca320892dbdda35cf3a9cbb138a1091efd1897c6655392/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:21 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9438fcb25b468b509ca320892dbdda35cf3a9cbb138a1091efd1897c6655392/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:21 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9438fcb25b468b509ca320892dbdda35cf3a9cbb138a1091efd1897c6655392/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:21 np0005596062 podman[81135]: 2026-01-26 17:42:21.04724563 +0000 UTC m=+0.133342127 container init 727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_stonebraker, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 26 12:42:21 np0005596062 podman[81135]: 2026-01-26 17:42:21.055949212 +0000 UTC m=+0.142045699 container start 727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_stonebraker, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3)
Jan 26 12:42:21 np0005596062 podman[81135]: 2026-01-26 17:42:21.059088586 +0000 UTC m=+0.145185073 container attach 727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_stonebraker, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 36 state: booting -> active
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.15( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.14( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.10( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.c( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.d( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.13( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.a( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-mon[77178]: OSD bench result of 8119.114292 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.2( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.3( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.6( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.8( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.1d( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.1b( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.1c( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.1b( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.19( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.1a( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.1f( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.15( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.12( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.15( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.e( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.11( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.f( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.9( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.8( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.18( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[4.1( empty local-lis/les=0/0 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.5( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.b( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.1c( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.1d( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[3.9( empty local-lis/les=0/0 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 36 pg[2.1d( empty local-lis/les=0/0 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]: [
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:    {
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "available": false,
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "ceph_device": false,
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "lsm_data": {},
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "lvs": [],
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "path": "/dev/sr0",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "rejected_reasons": [
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "Has a FileSystem",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "Insufficient space (<5GB)"
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        ],
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        "sys_api": {
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "actuators": null,
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "device_nodes": "sr0",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "devname": "sr0",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "human_readable_size": "482.00 KB",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "id_bus": "ata",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "model": "QEMU DVD-ROM",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "nr_requests": "2",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "parent": "/dev/sr0",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "partitions": {},
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "path": "/dev/sr0",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "removable": "1",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "rev": "2.5+",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "ro": "0",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "rotational": "1",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "sas_address": "",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "sas_device_handle": "",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "scheduler_mode": "mq-deadline",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "sectors": 0,
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "sectorsize": "2048",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "size": 493568.0,
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "support_discard": "2048",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "type": "disk",
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:            "vendor": "QEMU"
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:        }
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]:    }
Jan 26 12:42:22 np0005596062 distracted_stonebraker[81151]: ]
Jan 26 12:42:22 np0005596062 systemd[1]: libpod-727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251.scope: Deactivated successfully.
Jan 26 12:42:22 np0005596062 systemd[1]: libpod-727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251.scope: Consumed 1.207s CPU time.
Jan 26 12:42:22 np0005596062 podman[81135]: 2026-01-26 17:42:22.247548501 +0000 UTC m=+1.333644988 container died 727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_stonebraker, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:42:22 np0005596062 ceph-mon[77178]: osd.2 [v2:192.168.122.102:6800/815499186,v1:192.168.122.102:6801/815499186] boot
Jan 26 12:42:22 np0005596062 systemd[1]: var-lib-containers-storage-overlay-f9438fcb25b468b509ca320892dbdda35cf3a9cbb138a1091efd1897c6655392-merged.mount: Deactivated successfully.
Jan 26 12:42:22 np0005596062 podman[81135]: 2026-01-26 17:42:22.496494189 +0000 UTC m=+1.582590676 container remove 727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=distracted_stonebraker, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:42:22 np0005596062 systemd[1]: libpod-conmon-727901ad15f637f415189a7da5717959d41f5046c88e8678d1417f0dac391251.scope: Deactivated successfully.
Jan 26 12:42:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.1b( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.1c( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.19( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.15( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.14( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.1d( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.10( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.0( empty local-lis/les=36/37 n=0 ec=16/16 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[5.0( empty local-lis/les=36/37 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.d( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.a( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.c( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.6( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.13( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.2( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.9( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.3( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=23/23 les/c/f=24/24/0 sis=36) [2] r=0 lpr=36 pi=[23,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.1b( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.1c( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.1d( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.b( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.8( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.5( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.1d( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.1( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.12( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.11( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.1a( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.15( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[3.e( empty local-lis/les=36/37 n=0 ec=21/16 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.15( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.f( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.9( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.1f( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[2.18( empty local-lis/les=36/37 n=0 ec=21/14 lis/c=21/21 les/c/f=22/22/0 sis=36) [2] r=0 lpr=36 pi=[21,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 37 pg[4.8( empty local-lis/les=36/37 n=0 ec=23/17 lis/c=27/27 les/c/f=29/29/0 sis=36) [2] r=0 lpr=36 pi=[27,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e2 new map
Jan 26 12:42:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:22.558341+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Jan 26 12:42:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 26 12:42:23 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Jan 26 12:42:23 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:42:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:24 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.d scrub starts
Jan 26 12:42:24 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.d scrub ok
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: Unable to set osd_memory_target on compute-2 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 12:42:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:25 np0005596062 ceph-mon[77178]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 12:42:25 np0005596062 ceph-mon[77178]: Updating compute-0:/var/lib/ceph/d4cd1917-5876-51b6-bc64-65a16199754d/config/ceph.conf
Jan 26 12:42:25 np0005596062 ceph-mon[77178]: Updating compute-2:/var/lib/ceph/d4cd1917-5876-51b6-bc64-65a16199754d/config/ceph.conf
Jan 26 12:42:25 np0005596062 ceph-mon[77178]: Updating compute-1:/var/lib/ceph/d4cd1917-5876-51b6-bc64-65a16199754d/config/ceph.conf
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:42:27 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.c scrub starts
Jan 26 12:42:27 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.c scrub ok
Jan 26 12:42:28 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/2224340308' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 26 12:42:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:29 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/2224340308' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 26 12:42:30 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Jan 26 12:42:30 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Jan 26 12:42:31 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Jan 26 12:42:31 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Jan 26 12:42:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:37 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Jan 26 12:42:37 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.66890108 +0000 UTC m=+0.066644718 container create faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_euclid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Jan 26 12:42:38 np0005596062 systemd[72598]: Starting Mark boot as successful...
Jan 26 12:42:38 np0005596062 systemd[72598]: Finished Mark boot as successful.
Jan 26 12:42:38 np0005596062 systemd[1]: Started libpod-conmon-faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291.scope.
Jan 26 12:42:38 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.644827421 +0000 UTC m=+0.042571099 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.753719079 +0000 UTC m=+0.151462757 container init faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_euclid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.768715812 +0000 UTC m=+0.166459420 container start faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_euclid, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.772754572 +0000 UTC m=+0.170498210 container attach faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_euclid, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 26 12:42:38 np0005596062 cool_euclid[83125]: 167 167
Jan 26 12:42:38 np0005596062 systemd[1]: libpod-faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291.scope: Deactivated successfully.
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.777380787 +0000 UTC m=+0.175124425 container died faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_euclid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True)
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.vncnzm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.vncnzm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: Deploying daemon rgw.rgw.compute-2.vncnzm on compute-2
Jan 26 12:42:38 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1867336082' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 26 12:42:38 np0005596062 systemd[1]: var-lib-containers-storage-overlay-d35f539284c4d8089e1425360a2806daf0f7567e1088e818555229cd51db571a-merged.mount: Deactivated successfully.
Jan 26 12:42:38 np0005596062 podman[83108]: 2026-01-26 17:42:38.826025307 +0000 UTC m=+0.223768955 container remove faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:42:38 np0005596062 systemd[1]: libpod-conmon-faa2acc0eec601996f66ffb1502cb0cbe8c9659b644ad71f7b830d510cb44291.scope: Deactivated successfully.
Jan 26 12:42:38 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:39 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:39 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:39 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:39 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:39 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:39 np0005596062 systemd[1]: Starting Ceph rgw.rgw.compute-2.vncnzm for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:42:39 np0005596062 podman[83270]: 2026-01-26 17:42:39.878559176 +0000 UTC m=+0.060680300 container create 587ad199a52736530409524328feef99453e226f537b627dd2ce21517a911893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-rgw-rgw-compute-2-vncnzm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 26 12:42:39 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cb160e70bb6dd8b820c7b49cf8a5802b0027c07fde9fdebab47de338f7ffe3c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:39 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cb160e70bb6dd8b820c7b49cf8a5802b0027c07fde9fdebab47de338f7ffe3c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:39 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cb160e70bb6dd8b820c7b49cf8a5802b0027c07fde9fdebab47de338f7ffe3c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:39 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cb160e70bb6dd8b820c7b49cf8a5802b0027c07fde9fdebab47de338f7ffe3c/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.vncnzm supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:39 np0005596062 podman[83270]: 2026-01-26 17:42:39.851078562 +0000 UTC m=+0.033199786 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:39 np0005596062 podman[83270]: 2026-01-26 17:42:39.973731122 +0000 UTC m=+0.155852286 container init 587ad199a52736530409524328feef99453e226f537b627dd2ce21517a911893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-rgw-rgw-compute-2-vncnzm, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 26 12:42:39 np0005596062 podman[83270]: 2026-01-26 17:42:39.986256243 +0000 UTC m=+0.168377367 container start 587ad199a52736530409524328feef99453e226f537b627dd2ce21517a911893 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-rgw-rgw-compute-2-vncnzm, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:42:39 np0005596062 bash[83270]: 587ad199a52736530409524328feef99453e226f537b627dd2ce21517a911893
Jan 26 12:42:39 np0005596062 systemd[1]: Started Ceph rgw.rgw.compute-2.vncnzm for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:42:40 np0005596062 radosgw[83289]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 26 12:42:40 np0005596062 radosgw[83289]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 26 12:42:40 np0005596062 radosgw[83289]: framework: beast
Jan 26 12:42:40 np0005596062 radosgw[83289]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 26 12:42:40 np0005596062 radosgw[83289]: init_numa not setting numa affinity
Jan 26 12:42:40 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Jan 26 12:42:40 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.dudysi", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.dudysi", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: Deploying daemon rgw.rgw.compute-1.dudysi on compute-1
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Jan 26 12:42:41 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/882037292' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 12:42:42 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.102:0/882037292' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 12:42:42 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 12:42:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zjkivk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zjkivk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: Deploying daemon rgw.rgw.compute-0.zjkivk on compute-0
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/882037292' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 12:42:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 26 12:42:44 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.102:0/882037292' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 12:42:44 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 12:42:44 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.101:0/942420072' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 12:42:44 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 12:42:45 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Jan 26 12:42:45 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/882037292' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.oqvedy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 12:42:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.oqvedy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.722515378 +0000 UTC m=+0.039736099 container create 187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_hypatia, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:42:45 np0005596062 systemd[1]: Started libpod-conmon-187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e.scope.
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.701935976 +0000 UTC m=+0.019156707 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:45 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.824612196 +0000 UTC m=+0.141832927 container init 187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_hypatia, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.834640116 +0000 UTC m=+0.151860847 container start 187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_hypatia, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.838504612 +0000 UTC m=+0.155725343 container attach 187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_hypatia, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:42:45 np0005596062 ecstatic_hypatia[83508]: 167 167
Jan 26 12:42:45 np0005596062 systemd[1]: libpod-187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e.scope: Deactivated successfully.
Jan 26 12:42:45 np0005596062 conmon[83508]: conmon 187dcfb055998cded316 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e.scope/container/memory.events
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.842598704 +0000 UTC m=+0.159819435 container died 187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_hypatia, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:42:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay-97ea15dbdf0367ba6ca1062c00d4677730a4b896ede2c5aa00e7b05bd0bd0afd-merged.mount: Deactivated successfully.
Jan 26 12:42:45 np0005596062 podman[83492]: 2026-01-26 17:42:45.891019758 +0000 UTC m=+0.208240449 container remove 187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_hypatia, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 12:42:45 np0005596062 systemd[1]: libpod-conmon-187dcfb055998cded316eb13bfdbcc697d764b358965bcb36a396d8a5be9e77e.scope: Deactivated successfully.
Jan 26 12:42:45 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:46 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:46 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:46 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:46 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:46 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 26 12:42:46 np0005596062 systemd[1]: Starting Ceph mds.cephfs.compute-2.oqvedy for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: Deploying daemon mds.cephfs.compute-2.oqvedy on compute-2
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.102:0/882037292' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.101:0/942420072' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/814761124' entity='client.rgw.rgw.compute-0.zjkivk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 12:42:46 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/814761124' entity='client.rgw.rgw.compute-0.zjkivk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 12:42:46 np0005596062 podman[83651]: 2026-01-26 17:42:46.824432235 +0000 UTC m=+0.047172343 container create 938f23de213090a8b589e2de238212ae59f3e6c6126c2975f6a610a07eddb422 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mds-cephfs-compute-2-oqvedy, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 26 12:42:46 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f49b6f0c54c8a8e143a777faea452f424a6803224292c6d42792a832e76bbea8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:46 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f49b6f0c54c8a8e143a777faea452f424a6803224292c6d42792a832e76bbea8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:46 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f49b6f0c54c8a8e143a777faea452f424a6803224292c6d42792a832e76bbea8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:46 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f49b6f0c54c8a8e143a777faea452f424a6803224292c6d42792a832e76bbea8/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.oqvedy supports timestamps until 2038 (0x7fffffff)
Jan 26 12:42:46 np0005596062 podman[83651]: 2026-01-26 17:42:46.904764862 +0000 UTC m=+0.127504990 container init 938f23de213090a8b589e2de238212ae59f3e6c6126c2975f6a610a07eddb422 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mds-cephfs-compute-2-oqvedy, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 26 12:42:46 np0005596062 podman[83651]: 2026-01-26 17:42:46.808669614 +0000 UTC m=+0.031409732 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:42:46 np0005596062 podman[83651]: 2026-01-26 17:42:46.918591606 +0000 UTC m=+0.141331704 container start 938f23de213090a8b589e2de238212ae59f3e6c6126c2975f6a610a07eddb422 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mds-cephfs-compute-2-oqvedy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True)
Jan 26 12:42:46 np0005596062 bash[83651]: 938f23de213090a8b589e2de238212ae59f3e6c6126c2975f6a610a07eddb422
Jan 26 12:42:46 np0005596062 systemd[1]: Started Ceph mds.cephfs.compute-2.oqvedy for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:42:46 np0005596062 ceph-mds[83671]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 12:42:46 np0005596062 ceph-mds[83671]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 26 12:42:46 np0005596062 ceph-mds[83671]: main not setting numa affinity
Jan 26 12:42:46 np0005596062 ceph-mds[83671]: pidfile_write: ignore empty --pid-file
Jan 26 12:42:46 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mds-cephfs-compute-2-oqvedy[83667]: starting mds.cephfs.compute-2.oqvedy at 
Jan 26 12:42:46 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy Updating MDS map to version 2 from mon.1
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.wenkwv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.wenkwv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e3 new map
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:22.558341+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.oqvedy{-1:24157} state up:standby seq 1 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy Updating MDS map to version 3 from mon.1
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy Monitors have assigned me to become a standby.
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e4 new map
Jan 26 12:42:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:47.614358+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:creating seq 1 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy Updating MDS map to version 4 from mon.1
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x1
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x100
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x600
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x601
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x602
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x603
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x604
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x605
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x606
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x607
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x608
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.cache creating system inode with ino:0x609
Jan 26 12:42:47 np0005596062 ceph-mds[83671]: mds.0.4 creating_done
Jan 26 12:42:48 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 26 12:42:48 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1450920383' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: Deploying daemon mds.cephfs.compute-0.wenkwv on compute-0
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: daemon mds.cephfs.compute-2.oqvedy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: daemon mds.cephfs.compute-2.oqvedy is now active in filesystem cephfs as rank 0
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1497594370' entity='client.rgw.rgw.compute-0.zjkivk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.101:0/3727664983' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.102:0/1450920383' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e5 new map
Jan 26 12:42:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:48.628872+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 26 12:42:48 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy Updating MDS map to version 5 from mon.1
Jan 26 12:42:48 np0005596062 ceph-mds[83671]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 26 12:42:48 np0005596062 ceph-mds[83671]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 26 12:42:48 np0005596062 ceph-mds[83671]: mds.0.4 recovery_done -- successful recovery!
Jan 26 12:42:48 np0005596062 ceph-mds[83671]: mds.0.4 active_start
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1450920383' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.oxxatt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.oxxatt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: Deploying daemon mds.cephfs.compute-1.oxxatt on compute-1
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1497594370' entity='client.rgw.rgw.compute-0.zjkivk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1497594370' entity='client.rgw.rgw.compute-0.zjkivk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.101:0/3727664983' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.102:0/1450920383' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e6 new map
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:48.628872+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.wenkwv{-1:14373} state up:standby seq 1 addr [v2:192.168.122.100:6806/1188189847,v1:192.168.122.100:6807/1188189847] compat {c=[1],r=[1],i=[7ff]}]
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e7 new map
Jan 26 12:42:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:48.628872+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:active seq 2 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.wenkwv{-1:14373} state up:standby seq 1 addr [v2:192.168.122.100:6806/1188189847,v1:192.168.122.100:6807/1188189847] compat {c=[1],r=[1],i=[7ff]}]
Jan 26 12:42:50 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Jan 26 12:42:50 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Jan 26 12:42:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 26 12:42:51 np0005596062 radosgw[83289]: LDAP not started since no server URIs were provided in the configuration.
Jan 26 12:42:51 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-rgw-rgw-compute-2-vncnzm[83285]: 2026-01-26T17:42:51.567+0000 7fa2b2561940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 26 12:42:51 np0005596062 radosgw[83289]: framework: beast
Jan 26 12:42:51 np0005596062 radosgw[83289]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 26 12:42:51 np0005596062 radosgw[83289]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 26 12:42:51 np0005596062 radosgw[83289]: starting handler: beast
Jan 26 12:42:51 np0005596062 radosgw[83289]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 12:42:51 np0005596062 radosgw[83289]: mgrc service_daemon_register rgw.24172 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.vncnzm,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864308,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=16904eab-39d2-4595-a23c-00ad5300f474,zone_name=default,zonegroup_id=7e31b727-958b-45d2-9b71-ae42e4dae024,zonegroup_name=default}
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='client.? 192.168.122.100:0/1497594370' entity='client.rgw.rgw.compute-0.zjkivk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-2.vncnzm' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='client.? ' entity='client.rgw.rgw.compute-1.dudysi' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: Deploying daemon haproxy.rgw.default.compute-0.wyazzh on compute-0
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e8 new map
Jan 26 12:42:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:52.412531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.wenkwv{-1:14373} state up:standby seq 1 addr [v2:192.168.122.100:6806/1188189847,v1:192.168.122.100:6807/1188189847] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.oxxatt{-1:24149} state up:standby seq 1 addr [v2:192.168.122.101:6804/245886810,v1:192.168.122.101:6805/245886810] compat {c=[1],r=[1],i=[7ff]}]
Jan 26 12:42:52 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy Updating MDS map to version 8 from mon.1
Jan 26 12:42:53 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Jan 26 12:42:53 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Jan 26 12:42:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e9 new map
Jan 26 12:42:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:52.412531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.wenkwv{-1:14373} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/1188189847,v1:192.168.122.100:6807/1188189847] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.oxxatt{-1:24149} state up:standby seq 1 addr [v2:192.168.122.101:6804/245886810,v1:192.168.122.101:6805/245886810] compat {c=[1],r=[1],i=[7ff]}]
Jan 26 12:42:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:55 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Jan 26 12:42:55 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Jan 26 12:42:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:42:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e10 new map
Jan 26 12:42:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).mds e10 print_map#012e10#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T17:42:22.558304+0000#012modified#0112026-01-26T17:42:52.412531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24157}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.oqvedy{0:24157} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/565974780,v1:192.168.122.102:6805/565974780] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.wenkwv{-1:14373} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/1188189847,v1:192.168.122.100:6807/1188189847] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.oxxatt{-1:24149} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/245886810,v1:192.168.122.101:6805/245886810] compat {c=[1],r=[1],i=[7ff]}]
Jan 26 12:42:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 26 12:42:56 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Jan 26 12:42:56 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:42:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 26 12:42:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:42:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000050s ======
Jan 26 12:42:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:42:57.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Jan 26 12:42:57 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Jan 26 12:42:57 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Jan 26 12:42:57 np0005596062 ceph-mon[77178]: Deploying daemon haproxy.rgw.default.compute-2.dyvhne on compute-2
Jan 26 12:42:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Jan 26 12:42:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:42:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Jan 26 12:42:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:42:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 26 12:42:57 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 51 pg[5.0( empty local-lis/les=36/37 n=0 ec=19/19 lis/c=36/36 les/c/f=37/37/0 sis=51 pruub=12.734590530s) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active pruub 54.601913452s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:42:57 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 51 pg[5.0( empty local-lis/les=36/37 n=0 ec=19/19 lis/c=36/36 les/c/f=37/37/0 sis=51 pruub=12.734590530s) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown pruub 54.601913452s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:42:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:42:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Jan 26 12:42:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:42:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:42:58 np0005596062 podman[84397]: 2026-01-26 17:42:58.829162783 +0000 UTC m=+2.153182446 container create 2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b (image=quay.io/ceph/haproxy:2.3, name=quirky_darwin)
Jan 26 12:42:58 np0005596062 systemd[1]: Started libpod-conmon-2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b.scope.
Jan 26 12:42:58 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:42:58 np0005596062 podman[84397]: 2026-01-26 17:42:58.813058473 +0000 UTC m=+2.137078186 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 26 12:42:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:42:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:42:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:42:59.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:42:59 np0005596062 podman[84397]: 2026-01-26 17:42:59.10774395 +0000 UTC m=+2.431763643 container init 2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b (image=quay.io/ceph/haproxy:2.3, name=quirky_darwin)
Jan 26 12:42:59 np0005596062 podman[84397]: 2026-01-26 17:42:59.115741109 +0000 UTC m=+2.439760772 container start 2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b (image=quay.io/ceph/haproxy:2.3, name=quirky_darwin)
Jan 26 12:42:59 np0005596062 podman[84397]: 2026-01-26 17:42:59.118787454 +0000 UTC m=+2.442807157 container attach 2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b (image=quay.io/ceph/haproxy:2.3, name=quirky_darwin)
Jan 26 12:42:59 np0005596062 quirky_darwin[84513]: 0 0
Jan 26 12:42:59 np0005596062 systemd[1]: libpod-2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b.scope: Deactivated successfully.
Jan 26 12:42:59 np0005596062 podman[84397]: 2026-01-26 17:42:59.123680666 +0000 UTC m=+2.447700339 container died 2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b (image=quay.io/ceph/haproxy:2.3, name=quirky_darwin)
Jan 26 12:42:59 np0005596062 systemd[1]: var-lib-containers-storage-overlay-ae96de96283bd794de84ffadf92fbf068f1198785ce96d8d7bd7d9dbf9a836a5-merged.mount: Deactivated successfully.
Jan 26 12:42:59 np0005596062 podman[84397]: 2026-01-26 17:42:59.169251379 +0000 UTC m=+2.493271052 container remove 2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b (image=quay.io/ceph/haproxy:2.3, name=quirky_darwin)
Jan 26 12:42:59 np0005596062 systemd[1]: libpod-conmon-2de63763f0e3c64f730fe2f2ca974cb8dc7aa3289d8e636e88092dbd2010285b.scope: Deactivated successfully.
Jan 26 12:42:59 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:59 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:59 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.19( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1f( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.8( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.13( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.9( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.18( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1e( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1d( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1c( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.15( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.12( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.6( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.17( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.a( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.7( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.14( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.d( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.3( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.b( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.e( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.2( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.c( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1b( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.f( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1a( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.4( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.5( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.10( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.11( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.16( empty local-lis/les=36/37 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.19( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.13( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.18( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1e( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1d( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.9( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1f( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.12( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1c( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.15( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.0( empty local-lis/les=51/52 n=0 ec=19/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.8( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.17( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.a( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.7( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.b( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.6( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.14( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.d( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.2( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.e( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.c( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.f( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1b( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.1a( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.5( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.4( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.3( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.10( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.11( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 52 pg[5.16( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=36/36 les/c/f=37/37/0 sis=51) [2] r=0 lpr=51 pi=[36,51)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:42:59 np0005596062 systemd[1]: Reloading.
Jan 26 12:42:59 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:42:59 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:42:59 np0005596062 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.dyvhne for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:42:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:42:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:42:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:42:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:43:00 np0005596062 podman[84656]: 2026-01-26 17:43:00.033167569 +0000 UTC m=+0.063940601 container create 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:00 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e528f3b6b83bed932a37e60ab0ff8dba2927043b0b94e5efce3fdf323e5f0748/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 26 12:43:00 np0005596062 podman[84656]: 2026-01-26 17:43:00.08466953 +0000 UTC m=+0.115442552 container init 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:00 np0005596062 podman[84656]: 2026-01-26 17:43:00.093235473 +0000 UTC m=+0.124008475 container start 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:00 np0005596062 bash[84656]: 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92
Jan 26 12:43:00 np0005596062 podman[84656]: 2026-01-26 17:43:00.005950353 +0000 UTC m=+0.036723455 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 26 12:43:00 np0005596062 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.dyvhne for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:43:00 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne[84671]: [NOTICE] 025/174300 (2) : New worker #1 (4) forked
Jan 26 12:43:00 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Jan 26 12:43:00 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Jan 26 12:43:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 26 12:43:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:01.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:43:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:01.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: Deploying daemon keepalived.rgw.default.compute-2.alfrff on compute-2
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:43:02 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 12:43:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:03.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 26 12:43:03 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Jan 26 12:43:03 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Jan 26 12:43:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:03.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:43:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.733111546 +0000 UTC m=+3.249250578 container create e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b (image=quay.io/ceph/keepalived:2.2.4, name=angry_elbakyan, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vendor=Red Hat, Inc., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 12:43:04 np0005596062 systemd[1]: Started libpod-conmon-e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b.scope.
Jan 26 12:43:04 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.715822957 +0000 UTC m=+3.231962009 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.824060737 +0000 UTC m=+3.340199799 container init e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b (image=quay.io/ceph/keepalived:2.2.4, name=angry_elbakyan, release=1793, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=)
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.831742818 +0000 UTC m=+3.347881870 container start e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b (image=quay.io/ceph/keepalived:2.2.4, name=angry_elbakyan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, architecture=x86_64, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.835786908 +0000 UTC m=+3.351925980 container attach e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b (image=quay.io/ceph/keepalived:2.2.4, name=angry_elbakyan, description=keepalived for Ceph, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, release=1793, vcs-type=git)
Jan 26 12:43:04 np0005596062 angry_elbakyan[84922]: 0 0
Jan 26 12:43:04 np0005596062 systemd[1]: libpod-e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b.scope: Deactivated successfully.
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.83988737 +0000 UTC m=+3.356026402 container died e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b (image=quay.io/ceph/keepalived:2.2.4, name=angry_elbakyan, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived)
Jan 26 12:43:04 np0005596062 systemd[1]: var-lib-containers-storage-overlay-4eb1897af55fcea72632df4286105236632908e5c308e0cace2bb78d7aabff06-merged.mount: Deactivated successfully.
Jan 26 12:43:04 np0005596062 podman[84824]: 2026-01-26 17:43:04.877977698 +0000 UTC m=+3.394116720 container remove e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b (image=quay.io/ceph/keepalived:2.2.4, name=angry_elbakyan, io.openshift.expose-services=, com.redhat.component=keepalived-container, vcs-type=git, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=keepalived)
Jan 26 12:43:04 np0005596062 systemd[1]: libpod-conmon-e86e49a96d65662af1f751e944027d1dab8331a7e5065d0124db1f977ce1653b.scope: Deactivated successfully.
Jan 26 12:43:04 np0005596062 systemd[1]: Reloading.
Jan 26 12:43:05 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:43:05 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:43:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:05.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:05 np0005596062 systemd[1]: Reloading.
Jan 26 12:43:05 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:43:05 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:43:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:05.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:05 np0005596062 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.alfrff for d4cd1917-5876-51b6-bc64-65a16199754d...
Jan 26 12:43:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 26 12:43:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:05 np0005596062 podman[85068]: 2026-01-26 17:43:05.774038507 +0000 UTC m=+0.041994595 container create 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=2.2.4, release=1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, vendor=Red Hat, Inc.)
Jan 26 12:43:05 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf173aef81293a8184a8199d06cc1386edb9bcbdc70928537b6a8a175edf9ee/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 12:43:05 np0005596062 podman[85068]: 2026-01-26 17:43:05.827067335 +0000 UTC m=+0.095023443 container init 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, com.redhat.component=keepalived-container, architecture=x86_64, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git)
Jan 26 12:43:05 np0005596062 podman[85068]: 2026-01-26 17:43:05.832820638 +0000 UTC m=+0.100776726 container start 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.expose-services=, name=keepalived, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.28.2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 12:43:05 np0005596062 bash[85068]: 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198
Jan 26 12:43:05 np0005596062 podman[85068]: 2026-01-26 17:43:05.755814834 +0000 UTC m=+0.023770952 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 26 12:43:05 np0005596062 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.alfrff for d4cd1917-5876-51b6-bc64-65a16199754d.
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: Starting VRRP child process, pid=4
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: Startup complete
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: (VI_0) Entering BACKUP STATE (init)
Jan 26 12:43:05 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:05 2026: VRRP_Script(check_backend) succeeded
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 26 12:43:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:07.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:07 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 26 12:43:07 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: Deploying daemon keepalived.rgw.default.compute-0.erukyj on compute-0
Jan 26 12:43:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 12:43:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:07.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 12:43:08 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Jan 26 12:43:08 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Jan 26 12:43:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:09.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:09.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:43:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.7( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.9( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.a( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.3( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.1( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.f( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.d( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.e( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.15( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.16( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.17( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.3( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.c( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.19( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044697762s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.500106812s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.19( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044668198s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.500106812s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.9( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047889709s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503524780s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.9( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047843933s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503524780s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1f( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047959328s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503776550s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1d( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047728539s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503746033s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.18( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047408104s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503440857s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1c( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047724724s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503746033s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1d( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047689438s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503746033s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1c( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047640800s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503746033s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1f( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047838211s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503776550s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.5( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.18( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.047148705s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503440857s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.a( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1e( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046399117s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503479004s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.15( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046679497s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503768921s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.15( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046636581s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503768921s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.17( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046653748s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503814697s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1e( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046342850s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503479004s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.17( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046626091s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503814697s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.6( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046430588s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503807068s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.6( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.046371460s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503807068s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.b( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.a( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045774460s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503822327s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.8( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.7( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045685768s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503822327s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.7( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045556068s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503822327s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.a( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045639992s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503822327s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.14( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045254707s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503852844s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.14( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045212746s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503852844s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.3( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045389175s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.504051208s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.3( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.045338631s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.504051208s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.3( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.9( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.2( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044663429s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.504058838s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.2( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044583321s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.504058838s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1b( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044340134s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503982544s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.c( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044259071s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503890991s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.19( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1b( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044310570s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503982544s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.c( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044205666s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503890991s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.f( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.044017792s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503974915s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.f( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043983459s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503974915s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043970108s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.503990173s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.1( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043946266s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.503990173s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.5( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043892860s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.504028320s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.f( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.5( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043835640s) [1] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.504028320s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.1c( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.11( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043248177s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.504096985s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.16( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043190956s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.504104614s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.11( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043180466s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.504096985s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.16( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.043117523s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.504104614s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.6( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.10( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.042740822s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 active pruub 67.504051208s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[6.b( empty local-lis/les=0/0 n=0 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[5.10( empty local-lis/les=51/52 n=0 ec=51/19 lis/c=51/51 les/c/f=52/52/0 sis=59 pruub=14.042712212s) [0] r=-1 lpr=59 pi=[51,59)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 67.504051208s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.2( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.5( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[11.13( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.16( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.11( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[8.1f( empty local-lis/les=0/0 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.10( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.11( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.16( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.1d( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.1( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.a( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.12( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.1e( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.11( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.14( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.1f( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.4( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[7.5( empty local-lis/les=0/0 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.f( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 59 pg[10.3( empty local-lis/les=0/0 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:09 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:09 2026: (VI_0) Entering MASTER STATE
Jan 26 12:43:10 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Jan 26 12:43:10 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Jan 26 12:43:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:11.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.e scrub starts
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.e scrub ok
Jan 26 12:43:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:11.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.14( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.1( v 45'48 (0'0,45'48] local-lis/les=59/60 n=1 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.5( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.f( v 45'48 (0'0,45'48] local-lis/les=59/60 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.1e( v 45'48 (0'0,45'48] local-lis/les=59/60 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.4( v 45'48 (0'0,45'48] local-lis/les=59/60 n=1 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.1f( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.12( v 45'48 (0'0,45'48] local-lis/les=59/60 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.11( v 45'48 (0'0,45'48] local-lis/les=59/60 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.11( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.16( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.a( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.17( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.3( v 58'51 lc 45'38 (0'0,58'51] local-lis/les=59/60 n=1 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=58'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[7.1d( empty local-lis/les=59/60 n=0 ec=52/23 lis/c=52/52 les/c/f=53/53/0 sis=59) [2] r=0 lpr=59 pi=[52,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.5( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[10.10( v 45'48 (0'0,45'48] local-lis/les=59/60 n=0 ec=55/43 lis/c=55/55 les/c/f=56/56/0 sis=59) [2] r=0 lpr=59 pi=[55,59)/1 crt=45'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.b( v 48'39 lc 0'0 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.15( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.13( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.16( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.8( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.11( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.b( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.1f( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.19( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.d( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.e( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.3( v 48'39 lc 0'0 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.a( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.9( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.6( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.1( v 48'39 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.3( v 40'4 (0'0,40'4] local-lis/les=59/60 n=1 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.3( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.f( v 40'4 lc 0'0 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.d( v 48'39 lc 45'13 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.f( v 48'39 lc 45'1 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.2( v 40'4 (0'0,40'4] local-lis/les=59/60 n=1 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.5( v 48'39 lc 45'11 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.16( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[11.a( empty local-lis/les=59/60 n=0 ec=57/46 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.9( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.c( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[8.1c( v 40'4 (0'0,40'4] local-lis/les=59/60 n=0 ec=54/39 lis/c=54/54 les/c/f=55/55/0 sis=59) [2] r=0 lpr=59 pi=[54,59)/1 crt=40'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 60 pg[6.7( v 48'39 lc 45'21 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=51/51 les/c/f=52/52/0 sis=59) [2] r=0 lpr=59 pi=[51,59)/1 crt=48'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:43:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:43:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 26 12:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 26 12:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 26 12:43:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:13.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:13 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.9 deep-scrub starts
Jan 26 12:43:13 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.9 deep-scrub ok
Jan 26 12:43:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:14 np0005596062 podman[85369]: 2026-01-26 17:43:14.344396546 +0000 UTC m=+0.102351726 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 12:43:14 np0005596062 podman[85369]: 2026-01-26 17:43:14.471445925 +0000 UTC m=+0.229401095 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 12:43:14 np0005596062 ceph-mon[77178]: Health check failed: Degraded data redundancy: 8/213 objects degraded (3.756%), 4 pgs degraded (PG_DEGRADED)
Jan 26 12:43:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:15.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:15 np0005596062 podman[85525]: 2026-01-26 17:43:15.382574839 +0000 UTC m=+0.227252771 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:15 np0005596062 podman[85525]: 2026-01-26 17:43:15.400111205 +0000 UTC m=+0.244789017 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:15.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:15 np0005596062 podman[85589]: 2026-01-26 17:43:15.629463387 +0000 UTC m=+0.063796188 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, version=2.2.4, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, release=1793)
Jan 26 12:43:15 np0005596062 podman[85589]: 2026-01-26 17:43:15.643009563 +0000 UTC m=+0.077342364 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1793, vcs-type=git, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived)
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:43:16 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:16 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 26 12:43:16 np0005596062 ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff[85084]: Mon Jan 26 17:43:16 2026: (VI_0) Entering BACKUP STATE
Jan 26 12:43:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:17.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:17.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:18 np0005596062 systemd[1]: session-19.scope: Deactivated successfully.
Jan 26 12:43:18 np0005596062 systemd-logind[781]: Session 19 logged out. Waiting for processes to exit.
Jan 26 12:43:18 np0005596062 systemd[1]: session-19.scope: Consumed 9.802s CPU time.
Jan 26 12:43:18 np0005596062 systemd-logind[781]: Removed session 19.
Jan 26 12:43:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:19.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 26 12:43:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 26 12:43:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.b( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=8.282852173s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 48'39 active pruub 71.459884644s@ mbc={255={}}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.b( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=8.282770157s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 0'0 unknown NOTIFY pruub 71.459884644s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.3( v 48'39 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=8.282043457s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 48'39 active pruub 71.460052490s@ mbc={255={}}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.3( v 48'39 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=8.281881332s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 0'0 unknown NOTIFY pruub 71.460052490s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.f( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/61/0 sis=63 pruub=8.281723976s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 48'39 active pruub 71.460174561s@ mbc={255={}}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.7( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/61/0 sis=63 pruub=8.281692505s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 48'39 active pruub 71.460304260s@ mbc={255={}}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.f( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/61/0 sis=63 pruub=8.281529427s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 0'0 unknown NOTIFY pruub 71.460174561s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 63 pg[6.7( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/61/0 sis=63 pruub=8.281597137s) [0] r=-1 lpr=63 pi=[59,63)/1 crt=48'39 mlcod 0'0 unknown NOTIFY pruub 71.460304260s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 26 12:43:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 26 12:43:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:19.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:20 np0005596062 ceph-mon[77178]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 8/213 objects degraded (3.756%), 4 pgs degraded)
Jan 26 12:43:20 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 12:43:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 26 12:43:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 26 12:43:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.b( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.1b( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.f( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.3( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.7( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.17( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.1f( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:20 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 64 pg[9.13( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=63) [2] r=0 lpr=64 pi=[55,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:21.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.13( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.1f( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.1f( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.7( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.7( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.1b( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.b( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.13( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.b( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.f( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.1b( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.17( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.17( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.f( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.3( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 65 pg[9.3( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=65) [2]/[1] r=-1 lpr=65 pi=[55,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 26 12:43:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Jan 26 12:43:21 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Jan 26 12:43:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:21.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 26 12:43:22 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 26 12:43:22 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 26 12:43:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:23.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.13( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.13( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.b( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.7( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.b( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.7( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.17( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.17( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.3( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 67 pg[9.3( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Jan 26 12:43:23 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Jan 26 12:43:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:23.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.17( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.b( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.3( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.7( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 68 pg[9.13( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=65/55 les/c/f=66/56/0 sis=67) [2] r=0 lpr=67 pi=[55,67)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: Reconfiguring mgr.compute-0.mbryrf (monmap changed)...
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.mbryrf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: Reconfiguring daemon mgr.compute-0.mbryrf on compute-0
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 12:43:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:25.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:25.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:25 np0005596062 ceph-mon[77178]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 26 12:43:25 np0005596062 ceph-mon[77178]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 26 12:43:26 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Jan 26 12:43:26 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Jan 26 12:43:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:26 np0005596062 ceph-mon[77178]: Reconfiguring osd.1 (monmap changed)...
Jan 26 12:43:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 26 12:43:26 np0005596062 ceph-mon[77178]: Reconfiguring daemon osd.1 on compute-0
Jan 26 12:43:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:27.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:27.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 26 12:43:28 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Jan 26 12:43:28 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: Reconfiguring osd.0 (monmap changed)...
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: Reconfiguring daemon osd.0 on compute-1
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 26 12:43:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 12:43:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:29.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:29.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=69) [2] r=0 lpr=69 pi=[55,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[9.d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=69) [2] r=0 lpr=69 pi=[55,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[9.5( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=69) [2] r=0 lpr=69 pi=[55,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[9.15( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=69) [2] r=0 lpr=69 pi=[55,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[6.5( v 48'39 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=59/59 les/c/f=60/61/0 sis=69 pruub=13.485807419s) [0] r=-1 lpr=69 pi=[59,69)/1 crt=48'39 mlcod 48'39 active pruub 87.460601807s@ mbc={255={}}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[6.5( v 48'39 (0'0,48'39] local-lis/les=59/60 n=2 ec=51/21 lis/c=59/59 les/c/f=60/61/0 sis=69 pruub=13.485709190s) [0] r=-1 lpr=69 pi=[59,69)/1 crt=48'39 mlcod 0'0 unknown NOTIFY pruub 87.460601807s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[6.d( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=69 pruub=13.484887123s) [0] r=-1 lpr=69 pi=[59,69)/1 crt=48'39 mlcod 48'39 active pruub 87.460624695s@ mbc={255={}}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 69 pg[6.d( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=69 pruub=13.484621048s) [0] r=-1 lpr=69 pi=[59,69)/1 crt=48'39 mlcod 0'0 unknown NOTIFY pruub 87.460624695s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:30 np0005596062 ceph-mon[77178]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 26 12:43:30 np0005596062 ceph-mon[77178]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 26 12:43:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 12:43:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.15( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.15( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.5( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 70 pg[9.5( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.542401902 +0000 UTC m=+0.047724568 container create 6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_poincare, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:43:30 np0005596062 systemd[1]: Started libpod-conmon-6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025.scope.
Jan 26 12:43:30 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.522837055 +0000 UTC m=+0.028159741 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.627843776 +0000 UTC m=+0.133166462 container init 6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_poincare, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.63926759 +0000 UTC m=+0.144590256 container start 6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_poincare, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.643133746 +0000 UTC m=+0.148456432 container attach 6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_poincare, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:43:30 np0005596062 gracious_poincare[85811]: 167 167
Jan 26 12:43:30 np0005596062 systemd[1]: libpod-6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025.scope: Deactivated successfully.
Jan 26 12:43:30 np0005596062 conmon[85811]: conmon 6daf2bba35c2e614f568 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025.scope/container/memory.events
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.645901095 +0000 UTC m=+0.151223771 container died 6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_poincare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 26 12:43:30 np0005596062 systemd[1]: var-lib-containers-storage-overlay-c81c5a2fe0fca0d085aa75d9eb8a4daf79e03c1ac53031264e39a6d79057a487-merged.mount: Deactivated successfully.
Jan 26 12:43:30 np0005596062 podman[85795]: 2026-01-26 17:43:30.69113845 +0000 UTC m=+0.196461106 container remove 6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_poincare, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 12:43:30 np0005596062 systemd[1]: libpod-conmon-6daf2bba35c2e614f56811b2fa036030eed38fdfbe95a0c0bf55be4b2a42f025.scope: Deactivated successfully.
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 26 12:43:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.cchxrf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 12:43:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:31.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:31.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.572217227 +0000 UTC m=+0.044459747 container create b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_ramanujan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 12:43:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 26 12:43:31 np0005596062 systemd[1]: Started libpod-conmon-b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb.scope.
Jan 26 12:43:31 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.548847796 +0000 UTC m=+0.021090296 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.655464287 +0000 UTC m=+0.127706777 container init b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_ramanujan, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.662238115 +0000 UTC m=+0.134480635 container start b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_ramanujan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.666415589 +0000 UTC m=+0.138658089 container attach b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_ramanujan, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Jan 26 12:43:31 np0005596062 jolly_ramanujan[85963]: 167 167
Jan 26 12:43:31 np0005596062 systemd[1]: libpod-b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb.scope: Deactivated successfully.
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.668747337 +0000 UTC m=+0.140989857 container died b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_ramanujan, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 26 12:43:31 np0005596062 systemd[1]: var-lib-containers-storage-overlay-0b1b5ed0ca5de96208e7ac2d02b75ec9056d0ef64f992d02bb1f507ee5743c11-merged.mount: Deactivated successfully.
Jan 26 12:43:31 np0005596062 podman[85947]: 2026-01-26 17:43:31.710552666 +0000 UTC m=+0.182795156 container remove b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=jolly_ramanujan, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 12:43:31 np0005596062 systemd[1]: libpod-conmon-b8012fe42023909ad2e2116fdb4c1b921ec2fad02499187d6bb81ceb6c405bcb.scope: Deactivated successfully.
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.5( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 72 pg[9.5( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: Reconfiguring mgr.compute-2.cchxrf (monmap changed)...
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: Reconfiguring daemon mgr.compute-2.cchxrf on compute-2
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:32 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:32 np0005596062 podman[86153]: 2026-01-26 17:43:32.755825416 +0000 UTC m=+0.061941752 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:43:32 np0005596062 podman[86153]: 2026-01-26 17:43:32.865082692 +0000 UTC m=+0.171198978 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 12:43:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:33.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:33.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:33 np0005596062 podman[86354]: 2026-01-26 17:43:33.653654908 +0000 UTC m=+0.083038936 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:33 np0005596062 podman[86354]: 2026-01-26 17:43:33.668076767 +0000 UTC m=+0.097460795 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:43:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 26 12:43:33 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 73 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:33 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 73 pg[9.5( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=6 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:33 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 73 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:33 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 73 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=6 ec=55/41 lis/c=70/55 les/c/f=71/56/0 sis=72) [2] r=0 lpr=72 pi=[55,72)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e73 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:33 np0005596062 podman[86419]: 2026-01-26 17:43:33.953158395 +0000 UTC m=+0.062231789 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=Ceph keepalived, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.openshift.expose-services=)
Jan 26 12:43:33 np0005596062 podman[86419]: 2026-01-26 17:43:33.968680881 +0000 UTC m=+0.077754275 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, description=keepalived for Ceph, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., release=1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, architecture=x86_64, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 12:43:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 26 12:43:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:35.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:35 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 26 12:43:35 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 26 12:43:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:35.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:43:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:43:35 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 26 12:43:36 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Jan 26 12:43:36 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Jan 26 12:43:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:37.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:37.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 12:43:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:39.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 12:43:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 26 12:43:39 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 26 12:43:39 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 26 12:43:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:39.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:40 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Jan 26 12:43:40 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Jan 26 12:43:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 26 12:43:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 26 12:43:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:41.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 26 12:43:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 26 12:43:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 26 12:43:41 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 77 pg[9.8( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=77) [2] r=0 lpr=77 pi=[55,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:41 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 77 pg[9.18( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=77) [2] r=0 lpr=77 pi=[55,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:41.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 26 12:43:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 26 12:43:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 26 12:43:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:42 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 78 pg[9.8( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=78) [2]/[1] r=-1 lpr=78 pi=[55,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:42 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 78 pg[9.8( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=78) [2]/[1] r=-1 lpr=78 pi=[55,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:42 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 78 pg[9.18( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=78) [2]/[1] r=-1 lpr=78 pi=[55,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:42 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 78 pg[9.18( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=78) [2]/[1] r=-1 lpr=78 pi=[55,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:43.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:43 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Jan 26 12:43:43 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Jan 26 12:43:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:43.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:43:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:44 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Jan 26 12:43:44 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Jan 26 12:43:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 26 12:43:45 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.b scrub starts
Jan 26 12:43:45 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.b scrub ok
Jan 26 12:43:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:45.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 26 12:43:45 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 80 pg[9.18( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=78/55 les/c/f=79/56/0 sis=80) [2] r=0 lpr=80 pi=[55,80)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:45 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 80 pg[9.18( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=78/55 les/c/f=79/56/0 sis=80) [2] r=0 lpr=80 pi=[55,80)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:45 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 80 pg[9.8( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=78/55 les/c/f=79/56/0 sis=80) [2] r=0 lpr=80 pi=[55,80)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:45 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 80 pg[9.8( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=78/55 les/c/f=79/56/0 sis=80) [2] r=0 lpr=80 pi=[55,80)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:45.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 26 12:43:46 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 81 pg[9.18( v 48'1155 (0'0,48'1155] local-lis/les=80/81 n=5 ec=55/41 lis/c=78/55 les/c/f=79/56/0 sis=80) [2] r=0 lpr=80 pi=[55,80)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:46 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 81 pg[9.8( v 48'1155 (0'0,48'1155] local-lis/les=80/81 n=6 ec=55/41 lis/c=78/55 les/c/f=79/56/0 sis=80) [2] r=0 lpr=80 pi=[55,80)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 12:43:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:47.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 12:43:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:43:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:47.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:43:48 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.d deep-scrub starts
Jan 26 12:43:48 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.d deep-scrub ok
Jan 26 12:43:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:43:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:49.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:43:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 26 12:43:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 26 12:43:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 26 12:43:49 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 82 pg[9.19( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=82) [2] r=0 lpr=82 pi=[55,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:49 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 82 pg[9.9( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=82) [2] r=0 lpr=82 pi=[55,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:49 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 82 pg[6.9( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=82 pruub=9.852943420s) [1] r=-1 lpr=82 pi=[59,82)/1 crt=48'39 lcod 0'0 mlcod 0'0 active pruub 103.461631775s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:49 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 82 pg[6.9( v 48'39 (0'0,48'39] local-lis/les=59/60 n=1 ec=51/21 lis/c=59/59 les/c/f=60/60/0 sis=82 pruub=9.852532387s) [1] r=-1 lpr=82 pi=[59,82)/1 crt=48'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 103.461631775s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 26 12:43:50 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 83 pg[9.9( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=83) [2]/[1] r=-1 lpr=83 pi=[55,83)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:50 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 83 pg[9.9( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=83) [2]/[1] r=-1 lpr=83 pi=[55,83)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:50 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 83 pg[9.19( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=83) [2]/[1] r=-1 lpr=83 pi=[55,83)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:50 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 83 pg[9.19( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=55/55 les/c/f=56/56/0 sis=83) [2]/[1] r=-1 lpr=83 pi=[55,83)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 26 12:43:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 26 12:43:51 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 26 12:43:51 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 26 12:43:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:43:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:51.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:43:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 26 12:43:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:51.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 26 12:43:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 26 12:43:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 26 12:43:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 26 12:43:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 26 12:43:52 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 85 pg[9.9( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=83/55 les/c/f=84/56/0 sis=85) [2] r=0 lpr=85 pi=[55,85)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:52 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 85 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=83/55 les/c/f=84/56/0 sis=85) [2] r=0 lpr=85 pi=[55,85)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:52 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 85 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=83/55 les/c/f=84/56/0 sis=85) [2] r=0 lpr=85 pi=[55,85)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:52 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 85 pg[9.9( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=6 ec=55/41 lis/c=83/55 les/c/f=84/56/0 sis=85) [2] r=0 lpr=85 pi=[55,85)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 26 12:43:53 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 86 pg[9.9( v 48'1155 (0'0,48'1155] local-lis/les=85/86 n=6 ec=55/41 lis/c=83/55 les/c/f=84/56/0 sis=85) [2] r=0 lpr=85 pi=[55,85)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:53 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 86 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=85/86 n=5 ec=55/41 lis/c=83/55 les/c/f=84/56/0 sis=85) [2] r=0 lpr=85 pi=[55,85)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:43:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:53.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:43:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:53 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 26 12:43:53 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 26 12:43:53 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 26 12:43:53 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 26 12:43:54 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Jan 26 12:43:54 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Jan 26 12:43:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 26 12:43:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:55.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 26 12:43:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:43:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:55.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:43:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 26 12:43:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 26 12:43:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 26 12:43:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 26 12:43:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:57.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 26 12:43:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 26 12:43:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 26 12:43:57 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 89 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=6 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=89 pruub=8.238732338s) [0] r=-1 lpr=89 pi=[72,89)/1 crt=48'1155 mlcod 0'0 active pruub 109.658767700s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:57 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 89 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=89 pruub=8.238554001s) [0] r=-1 lpr=89 pi=[72,89)/1 crt=48'1155 mlcod 0'0 active pruub 109.658668518s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:57 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 89 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=6 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=89 pruub=8.238653183s) [0] r=-1 lpr=89 pi=[72,89)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 109.658767700s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:57 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 89 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=89 pruub=8.238485336s) [0] r=-1 lpr=89 pi=[72,89)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 109.658668518s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:43:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:57.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 26 12:43:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 26 12:43:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 26 12:43:58 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 90 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=6 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=90) [0]/[2] r=0 lpr=90 pi=[72,90)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:58 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 90 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=90) [0]/[2] r=0 lpr=90 pi=[72,90)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:43:58 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 90 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=90) [0]/[2] r=0 lpr=90 pi=[72,90)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:58 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 90 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=6 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=90) [0]/[2] r=0 lpr=90 pi=[72,90)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:43:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:43:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:43:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:43:59.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:43:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:43:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:43:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:43:59.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:43:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 26 12:43:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 91 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=90/91 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[72,90)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:59 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 91 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=90/91 n=6 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[72,90)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:43:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 26 12:43:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 26 12:44:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 26 12:44:00 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 92 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=90/91 n=5 ec=55/41 lis/c=90/72 les/c/f=91/73/0 sis=92 pruub=15.214726448s) [0] async=[0] r=-1 lpr=92 pi=[72,92)/1 crt=48'1155 mlcod 48'1155 active pruub 119.563766479s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:00 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 92 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=90/91 n=5 ec=55/41 lis/c=90/72 les/c/f=91/73/0 sis=92 pruub=15.214542389s) [0] r=-1 lpr=92 pi=[72,92)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 119.563766479s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:00 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 92 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=90/91 n=6 ec=55/41 lis/c=90/72 les/c/f=91/73/0 sis=92 pruub=15.216083527s) [0] async=[0] r=-1 lpr=92 pi=[72,92)/1 crt=48'1155 mlcod 48'1155 active pruub 119.565742493s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:00 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 92 pg[9.d( v 48'1155 (0'0,48'1155] local-lis/les=90/91 n=6 ec=55/41 lis/c=90/72 les/c/f=91/73/0 sis=92 pruub=15.215649605s) [0] r=-1 lpr=92 pi=[72,92)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 119.565742493s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 26 12:44:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 26 12:44:01 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Jan 26 12:44:01 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Jan 26 12:44:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:01.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 26 12:44:01 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 93 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=10.893328667s) [0] r=-1 lpr=93 pi=[67,93)/1 crt=48'1155 mlcod 0'0 active pruub 116.255325317s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:01 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 93 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=10.893221855s) [0] r=-1 lpr=93 pi=[67,93)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 116.255325317s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:01 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 93 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=10.890148163s) [0] r=-1 lpr=93 pi=[67,93)/1 crt=48'1155 mlcod 0'0 active pruub 116.253051758s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:01 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 93 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=10.890123367s) [0] r=-1 lpr=93 pi=[67,93)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 116.253051758s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:01.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 26 12:44:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 26 12:44:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 26 12:44:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 26 12:44:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 26 12:44:02 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 94 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=94) [0]/[2] r=0 lpr=94 pi=[67,94)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:02 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 94 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=94) [0]/[2] r=0 lpr=94 pi=[67,94)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:02 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 94 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=94) [0]/[2] r=0 lpr=94 pi=[67,94)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:02 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 94 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=6 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=94) [0]/[2] r=0 lpr=94 pi=[67,94)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:03.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:03.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 26 12:44:03 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 95 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=94/95 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=94) [0]/[2] async=[0] r=0 lpr=94 pi=[67,94)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:44:03 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 95 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=94/95 n=6 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=94) [0]/[2] async=[0] r=0 lpr=94 pi=[67,94)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:44:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 26 12:44:04 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 96 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=94/95 n=5 ec=55/41 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.013239861s) [0] async=[0] r=-1 lpr=96 pi=[67,96)/1 crt=48'1155 mlcod 48'1155 active pruub 123.627639771s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:04 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 96 pg[9.1f( v 48'1155 (0'0,48'1155] local-lis/les=94/95 n=5 ec=55/41 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.012971878s) [0] r=-1 lpr=96 pi=[67,96)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 123.627639771s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:04 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 96 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=94/95 n=6 ec=55/41 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.016065598s) [0] async=[0] r=-1 lpr=96 pi=[67,96)/1 crt=48'1155 mlcod 48'1155 active pruub 123.631370544s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:04 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 96 pg[9.f( v 48'1155 (0'0,48'1155] local-lis/les=94/95 n=6 ec=55/41 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.015957832s) [0] r=-1 lpr=96 pi=[67,96)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 123.631370544s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:05.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:05.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 26 12:44:06 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Jan 26 12:44:06 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Jan 26 12:44:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:07.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:07.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 26 12:44:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 26 12:44:09 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 26 12:44:09 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 26 12:44:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:09.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:09.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 26 12:44:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 26 12:44:10 np0005596062 systemd-logind[781]: New session 33 of user zuul.
Jan 26 12:44:10 np0005596062 systemd[1]: Started Session 33 of User zuul.
Jan 26 12:44:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 26 12:44:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 26 12:44:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 26 12:44:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:11.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:11 np0005596062 python3.9[86856]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:44:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:11.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:12 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 26 12:44:12 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 26 12:44:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 26 12:44:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 26 12:44:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:13.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:13.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:14 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Jan 26 12:44:14 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Jan 26 12:44:14 np0005596062 python3.9[87122]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:44:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 26 12:44:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 26 12:44:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:15.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:17.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:17.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 26 12:44:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 26 12:44:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 6.1 deep-scrub starts
Jan 26 12:44:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 6.1 deep-scrub ok
Jan 26 12:44:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:19.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 26 12:44:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 26 12:44:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 26 12:44:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:19.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 26 12:44:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:21.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 26 12:44:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 26 12:44:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:21.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:22 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 26 12:44:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 26 12:44:22 np0005596062 systemd[1]: session-33.scope: Deactivated successfully.
Jan 26 12:44:22 np0005596062 systemd[1]: session-33.scope: Consumed 8.789s CPU time.
Jan 26 12:44:22 np0005596062 systemd-logind[781]: Session 33 logged out. Waiting for processes to exit.
Jan 26 12:44:22 np0005596062 systemd-logind[781]: Removed session 33.
Jan 26 12:44:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:23.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:23.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 26 12:44:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 26 12:44:25 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.d deep-scrub starts
Jan 26 12:44:25 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.d deep-scrub ok
Jan 26 12:44:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:25.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:25.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:26 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 26 12:44:26 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 26 12:44:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:27.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:27.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 26 12:44:28 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 111 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=111 pruub=9.631876945s) [0] r=-1 lpr=111 pi=[72,111)/1 crt=48'1155 mlcod 0'0 active pruub 141.657012939s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:28 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 111 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=111 pruub=9.631232262s) [0] r=-1 lpr=111 pi=[72,111)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 141.657012939s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 26 12:44:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:29.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 26 12:44:29 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 112 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=112) [0]/[2] r=0 lpr=112 pi=[72,112)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:29 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 112 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=72/73 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=112) [0]/[2] r=0 lpr=112 pi=[72,112)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:29 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=74/74 les/c/f=75/75/0 sis=112) [2] r=0 lpr=112 pi=[74,112)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:29.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 26 12:44:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 26 12:44:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 26 12:44:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 26 12:44:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 113 pg[9.16( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=74/74 les/c/f=75/75/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[74,113)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 113 pg[9.16( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=74/74 les/c/f=75/75/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[74,113)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:30 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 113 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=112/113 n=5 ec=55/41 lis/c=72/72 les/c/f=73/73/0 sis=112) [0]/[2] async=[0] r=0 lpr=112 pi=[72,112)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:44:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 26 12:44:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:31.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:31.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:31 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 26 12:44:31 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 26 12:44:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 26 12:44:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 114 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=112/113 n=5 ec=55/41 lis/c=112/72 les/c/f=113/73/0 sis=114 pruub=14.760596275s) [0] async=[0] r=-1 lpr=114 pi=[72,114)/1 crt=48'1155 mlcod 48'1155 active pruub 150.957565308s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:32 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 114 pg[9.15( v 48'1155 (0'0,48'1155] local-lis/les=112/113 n=5 ec=55/41 lis/c=112/72 les/c/f=113/73/0 sis=114 pruub=14.760497093s) [0] r=-1 lpr=114 pi=[72,114)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 150.957565308s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:32 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.e deep-scrub starts
Jan 26 12:44:32 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.e deep-scrub ok
Jan 26 12:44:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 26 12:44:33 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 115 pg[9.16( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=113/74 les/c/f=114/75/0 sis=115) [2] r=0 lpr=115 pi=[74,115)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:33 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 115 pg[9.16( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=113/74 les/c/f=114/75/0 sis=115) [2] r=0 lpr=115 pi=[74,115)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:33.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:33.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 26 12:44:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 26 12:44:34 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 116 pg[9.16( v 48'1155 (0'0,48'1155] local-lis/les=115/116 n=5 ec=55/41 lis/c=113/74 les/c/f=114/75/0 sis=115) [2] r=0 lpr=115 pi=[74,115)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:44:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:35.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:35.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:36 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Jan 26 12:44:37 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Jan 26 12:44:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:37.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:37.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:38 np0005596062 systemd-logind[781]: New session 34 of user zuul.
Jan 26 12:44:38 np0005596062 systemd[1]: Started Session 34 of User zuul.
Jan 26 12:44:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:39 np0005596062 python3.9[87394]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 12:44:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:39.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 26 12:44:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:39.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 26 12:44:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 26 12:44:40 np0005596062 python3.9[87569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:44:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 26 12:44:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 26 12:44:41 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 118 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=85/86 n=5 ec=55/41 lis/c=85/85 les/c/f=86/86/0 sis=118 pruub=8.225731850s) [1] r=-1 lpr=118 pi=[85,118)/1 crt=48'1155 mlcod 0'0 active pruub 153.359725952s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:41 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 118 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=85/86 n=5 ec=55/41 lis/c=85/85 les/c/f=86/86/0 sis=118 pruub=8.224906921s) [1] r=-1 lpr=118 pi=[85,118)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 153.359725952s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:41.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:41.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:41 np0005596062 python3.9[87725]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:44:41 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Jan 26 12:44:41 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Jan 26 12:44:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 26 12:44:42 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 119 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=85/86 n=5 ec=55/41 lis/c=85/85 les/c/f=86/86/0 sis=119) [1]/[2] r=0 lpr=119 pi=[85,119)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:42 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 119 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=85/86 n=5 ec=55/41 lis/c=85/85 les/c/f=86/86/0 sis=119) [1]/[2] r=0 lpr=119 pi=[85,119)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:42 np0005596062 python3.9[87879]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:44:42 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 26 12:44:42 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 26 12:44:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 26 12:44:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:43.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:43.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:43 np0005596062 podman[88204]: 2026-01-26 17:44:43.755072404 +0000 UTC m=+0.077884069 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 26 12:44:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:43 np0005596062 python3.9[88189]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:44:43 np0005596062 podman[88204]: 2026-01-26 17:44:43.846059588 +0000 UTC m=+0.168871243 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:44:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 26 12:44:44 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 120 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=119/120 n=5 ec=55/41 lis/c=85/85 les/c/f=86/86/0 sis=119) [1]/[2] async=[1] r=0 lpr=119 pi=[85,119)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:44:44 np0005596062 podman[88511]: 2026-01-26 17:44:44.687409301 +0000 UTC m=+0.087922434 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:44:44 np0005596062 podman[88511]: 2026-01-26 17:44:44.723530216 +0000 UTC m=+0.124043269 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:44:44 np0005596062 python3.9[88507]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:44:45 np0005596062 podman[88599]: 2026-01-26 17:44:45.03932629 +0000 UTC m=+0.086552958 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, name=keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, release=1793, version=2.2.4, build-date=2023-02-22T09:23:20, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived)
Jan 26 12:44:45 np0005596062 podman[88599]: 2026-01-26 17:44:45.057403847 +0000 UTC m=+0.104630465 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, com.redhat.component=keepalived-container, name=keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vcs-type=git, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, release=1793)
Jan 26 12:44:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 26 12:44:45 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 121 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=119/120 n=5 ec=55/41 lis/c=119/85 les/c/f=120/86/0 sis=121 pruub=15.092225075s) [1] async=[1] r=-1 lpr=121 pi=[85,121)/1 crt=48'1155 mlcod 48'1155 active pruub 164.196792603s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:45 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 121 pg[9.19( v 48'1155 (0'0,48'1155] local-lis/les=119/120 n=5 ec=55/41 lis/c=119/85 les/c/f=120/86/0 sis=121 pruub=15.092044830s) [1] r=-1 lpr=121 pi=[85,121)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 164.196792603s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:45.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:45.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:45 np0005596062 python3.9[88829]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:44:45 np0005596062 network[88873]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:44:45 np0005596062 network[88874]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:44:45 np0005596062 network[88875]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:44:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 26 12:44:46 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:46 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:46 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:46 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:46 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:44:46 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.a deep-scrub starts
Jan 26 12:44:46 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.a deep-scrub ok
Jan 26 12:44:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:44:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:47.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:47.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.279106) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449489279303, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7187, "num_deletes": 256, "total_data_size": 13220731, "memory_usage": 13508352, "flush_reason": "Manual Compaction"}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 26 12:44:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:49.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449489347283, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7857525, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 253, "largest_seqno": 7192, "table_properties": {"data_size": 7829382, "index_size": 18428, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 81242, "raw_average_key_size": 23, "raw_value_size": 7762401, "raw_average_value_size": 2254, "num_data_blocks": 813, "num_entries": 3443, "num_filter_entries": 3443, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 1769449303, "file_creation_time": 1769449489, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 68263 microseconds, and 33308 cpu microseconds.
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.347379) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7857525 bytes OK
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.347407) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.349488) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.349515) EVENT_LOG_v1 {"time_micros": 1769449489349507, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.349536) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 13183078, prev total WAL file size 13183078, number of live WAL files 2.
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.354819) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7673KB) 8(1648B)]
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449489354996, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7859173, "oldest_snapshot_seqno": -1}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3190 keys, 7853743 bytes, temperature: kUnknown
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449489423421, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7853743, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7826291, "index_size": 18382, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 77026, "raw_average_key_size": 24, "raw_value_size": 7762459, "raw_average_value_size": 2433, "num_data_blocks": 813, "num_entries": 3190, "num_filter_entries": 3190, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769449489, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.423787) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7853743 bytes
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.425209) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.7 rd, 114.6 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(7.5, 0.0 +0.0 blob) out(7.5 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3448, records dropped: 258 output_compression: NoCompression
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.425241) EVENT_LOG_v1 {"time_micros": 1769449489425226, "job": 4, "event": "compaction_finished", "compaction_time_micros": 68516, "compaction_time_cpu_micros": 33091, "output_level": 6, "num_output_files": 1, "total_output_size": 7853743, "num_input_records": 3448, "num_output_records": 3190, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449489427990, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449489428066, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 26 12:44:49 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:44:49.354494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:44:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:49.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:49 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.b deep-scrub starts
Jan 26 12:44:49 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.b deep-scrub ok
Jan 26 12:44:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 26 12:44:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 26 12:44:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 26 12:44:50 np0005596062 python3.9[89170]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:44:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 26 12:44:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 26 12:44:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 26 12:44:51 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 125 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=125 pruub=8.758138657s) [1] r=-1 lpr=125 pi=[67,125)/1 crt=48'1155 mlcod 0'0 active pruub 164.254638672s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:51 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 125 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=125 pruub=8.758079529s) [1] r=-1 lpr=125 pi=[67,125)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 164.254638672s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:51 np0005596062 python3.9[89320]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:44:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 26 12:44:52 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 126 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=126) [1]/[2] r=0 lpr=126 pi=[67,126)/1 crt=48'1155 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:52 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 126 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=67/68 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=126) [1]/[2] r=0 lpr=126 pi=[67,126)/1 crt=48'1155 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 12:44:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 26 12:44:53 np0005596062 python3.9[89475]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:44:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:53.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:44:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:44:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 26 12:44:53 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 127 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=126/127 n=5 ec=55/41 lis/c=67/67 les/c/f=68/68/0 sis=126) [1]/[2] async=[1] r=0 lpr=126 pi=[67,126)/1 crt=48'1155 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:44:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:54 np0005596062 python3.9[89684]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:44:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 26 12:44:54 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 128 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=126/127 n=5 ec=55/41 lis/c=126/67 les/c/f=127/68/0 sis=128 pruub=14.861769676s) [1] async=[1] r=-1 lpr=128 pi=[67,128)/1 crt=48'1155 mlcod 48'1155 active pruub 173.591888428s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:44:54 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 128 pg[9.1b( v 48'1155 (0'0,48'1155] local-lis/les=126/127 n=5 ec=55/41 lis/c=126/67 les/c/f=127/68/0 sis=128 pruub=14.861682892s) [1] r=-1 lpr=128 pi=[67,128)/1 crt=48'1155 mlcod 0'0 unknown NOTIFY pruub 173.591888428s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 12:44:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:44:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:55.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:44:55 np0005596062 python3.9[89792]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:44:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:55.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:55 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.8 deep-scrub starts
Jan 26 12:44:55 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.8 deep-scrub ok
Jan 26 12:44:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:44:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 26 12:44:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:57.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:44:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:57.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:44:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 12:44:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:44:59.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:44:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:44:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:44:59.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:44:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 26 12:44:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 26 12:45:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 26 12:45:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:01.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:02 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 26 12:45:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 26 12:45:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:03.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 26 12:45:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 26 12:45:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 26 12:45:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:03.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 26 12:45:04 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 132 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=92/92 les/c/f=93/93/0 sis=131) [2] r=0 lpr=132 pi=[92,131)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:45:05 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 26 12:45:05 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 26 12:45:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:05.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 26 12:45:05 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 133 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=92/92 les/c/f=93/93/0 sis=133) [2]/[0] r=-1 lpr=133 pi=[92,133)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:45:05 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 133 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/41 lis/c=92/92 les/c/f=93/93/0 sis=133) [2]/[0] r=-1 lpr=133 pi=[92,133)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 12:45:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:05.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 12:45:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 12:45:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 26 12:45:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:07.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:07.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 26 12:45:07 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 135 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=133/92 les/c/f=134/93/0 sis=135) [2] r=0 lpr=135 pi=[92,135)/1 luod=0'0 crt=48'1155 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 26 12:45:07 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 135 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=0/0 n=5 ec=55/41 lis/c=133/92 les/c/f=134/93/0 sis=135) [2] r=0 lpr=135 pi=[92,135)/1 crt=48'1155 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 12:45:07 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Jan 26 12:45:07 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Jan 26 12:45:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 26 12:45:09 np0005596062 ceph-osd[79865]: osd.2 pg_epoch: 136 pg[9.1d( v 48'1155 (0'0,48'1155] local-lis/les=135/136 n=5 ec=55/41 lis/c=133/92 les/c/f=134/93/0 sis=135) [2] r=0 lpr=135 pi=[92,135)/1 crt=48'1155 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 12:45:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:09.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:09.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 26 12:45:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:11.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:11.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:13.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:13.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:15.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:45:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:15.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:45:16 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Jan 26 12:45:16 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Jan 26 12:45:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:17.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:17.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:17 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 26 12:45:17 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 26 12:45:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:19.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:19.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Jan 26 12:45:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Jan 26 12:45:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:21.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:21.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:21 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Jan 26 12:45:21 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Jan 26 12:45:22 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 26 12:45:22 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 26 12:45:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:23.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:23.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:24 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Jan 26 12:45:24 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Jan 26 12:45:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:25.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:25.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:25 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Jan 26 12:45:25 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Jan 26 12:45:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:27.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:27.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:28 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 26 12:45:28 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 26 12:45:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:29.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:29.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:29 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 26 12:45:30 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 26 12:45:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:31.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:31.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:31 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 26 12:45:31 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 26 12:45:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:33.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:33.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:34 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Jan 26 12:45:34 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Jan 26 12:45:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:35.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:35.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:37.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:37.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:39.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:39.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:41.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:41.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:41 np0005596062 python3.9[90237]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:45:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:43.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:43.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:43 np0005596062 python3.9[90525]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 12:45:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:44 np0005596062 python3.9[90678]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 12:45:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:45:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:45.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:45:45 np0005596062 python3.9[90830]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:45:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:45.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:46 np0005596062 python3.9[90983]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 12:45:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:47.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:47.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:47 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 26 12:45:47 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 26 12:45:48 np0005596062 python3.9[91136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:45:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:48 np0005596062 python3.9[91288]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:45:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:49.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:49 np0005596062 python3.9[91366]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:45:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:45:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:49.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:45:50 np0005596062 python3.9[91519]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:45:50 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 26 12:45:51 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 26 12:45:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:51.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:51.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:52 np0005596062 python3.9[91674]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 12:45:53 np0005596062 python3.9[91827]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 12:45:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:53.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:53.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:54 np0005596062 python3.9[91981]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 12:45:55 np0005596062 python3.9[92183]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 12:45:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:55.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:55.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:55 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.a scrub starts
Jan 26 12:45:55 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.a scrub ok
Jan 26 12:45:56 np0005596062 python3.9[92460]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:45:56 np0005596062 podman[92507]: 2026-01-26 17:45:56.318788876 +0000 UTC m=+0.095655672 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 26 12:45:56 np0005596062 podman[92507]: 2026-01-26 17:45:56.437985867 +0000 UTC m=+0.214852613 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 26 12:45:57 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 26 12:45:57 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 26 12:45:57 np0005596062 podman[92663]: 2026-01-26 17:45:57.315939278 +0000 UTC m=+0.077618139 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:45:57 np0005596062 podman[92663]: 2026-01-26 17:45:57.32720712 +0000 UTC m=+0.088885961 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:45:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:57.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:57 np0005596062 podman[92750]: 2026-01-26 17:45:57.553000974 +0000 UTC m=+0.062569786 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, version=2.2.4, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, architecture=x86_64, io.openshift.tags=Ceph keepalived, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vcs-type=git, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 12:45:57 np0005596062 podman[92750]: 2026-01-26 17:45:57.566150876 +0000 UTC m=+0.075719698 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64)
Jan 26 12:45:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:45:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:57.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:45:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:45:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:58 np0005596062 python3.9[92914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:45:59 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Jan 26 12:45:59 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Jan 26 12:45:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:45:59.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:59 np0005596062 python3.9[93182]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:45:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:45:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:45:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:45:59.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:45:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:45:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:45:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:45:59 np0005596062 python3.9[93277]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:46:00 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Jan 26 12:46:00 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Jan 26 12:46:01 np0005596062 python3.9[93430]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:46:01 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Jan 26 12:46:01 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Jan 26 12:46:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:01.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:01 np0005596062 python3.9[93508]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:46:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:01.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:01 np0005596062 systemd[72598]: Created slice User Background Tasks Slice.
Jan 26 12:46:01 np0005596062 systemd[72598]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 12:46:01 np0005596062 systemd[72598]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 12:46:02 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 26 12:46:02 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 26 12:46:02 np0005596062 python3.9[93662]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:46:03 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Jan 26 12:46:03 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Jan 26 12:46:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:03.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:03.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:04 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.f deep-scrub starts
Jan 26 12:46:04 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.f deep-scrub ok
Jan 26 12:46:05 np0005596062 python3.9[93814]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:46:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 12:46:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:05.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 12:46:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:05.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:06 np0005596062 python3.9[93967]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 12:46:06 np0005596062 python3.9[94117]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:46:07 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Jan 26 12:46:07 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Jan 26 12:46:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:07.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:07.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:08 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.3 deep-scrub starts
Jan 26 12:46:08 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 10.3 deep-scrub ok
Jan 26 12:46:08 np0005596062 python3.9[94270]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:46:08 np0005596062 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 12:46:08 np0005596062 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 12:46:08 np0005596062 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 12:46:08 np0005596062 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 12:46:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:08 np0005596062 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 12:46:09 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.b scrub starts
Jan 26 12:46:09 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.b scrub ok
Jan 26 12:46:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:46:09 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:46:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:09.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:09.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:09 np0005596062 python3.9[94481]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 12:46:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:11.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:11.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:13.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:13 np0005596062 python3.9[94635]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:46:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:13.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:14 np0005596062 python3.9[94790]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:46:15 np0005596062 systemd[1]: session-34.scope: Deactivated successfully.
Jan 26 12:46:15 np0005596062 systemd[1]: session-34.scope: Consumed 1min 8.598s CPU time.
Jan 26 12:46:15 np0005596062 systemd-logind[781]: Session 34 logged out. Waiting for processes to exit.
Jan 26 12:46:15 np0005596062 systemd-logind[781]: Removed session 34.
Jan 26 12:46:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:15.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:15.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:17.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:17.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:17 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Jan 26 12:46:18 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Jan 26 12:46:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:18 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 26 12:46:19 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 26 12:46:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:19.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:19.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:20 np0005596062 systemd-logind[781]: New session 35 of user zuul.
Jan 26 12:46:20 np0005596062 systemd[1]: Started Session 35 of User zuul.
Jan 26 12:46:21 np0005596062 python3.9[95023]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:46:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:21.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:21.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:22 np0005596062 python3.9[95180]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 12:46:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:23.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:23.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:23 np0005596062 python3.9[95333]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:46:24 np0005596062 python3.9[95418]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 12:46:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:25.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:25.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:27.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:27 np0005596062 python3.9[95572]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:46:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:27.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:28 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 26 12:46:29 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 26 12:46:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:29.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:29.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:31 np0005596062 python3.9[95727]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:46:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:31.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:31.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:32 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.17 deep-scrub starts
Jan 26 12:46:32 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.17 deep-scrub ok
Jan 26 12:46:32 np0005596062 python3.9[95881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:46:33 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Jan 26 12:46:33 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Jan 26 12:46:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:33.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:33 np0005596062 python3.9[96033]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 12:46:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:33.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:34 np0005596062 python3.9[96184]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:46:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:35.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:35.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:36 np0005596062 python3.9[96393]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:46:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:37.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:37.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:38 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 26 12:46:38 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 26 12:46:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:38 np0005596062 python3.9[96547]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:46:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:39.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:39.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:40 np0005596062 python3.9[96835]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 12:46:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:41.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:41.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:42 np0005596062 python3.9[96985]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:46:42 np0005596062 python3.9[97140]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:46:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:43.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:43.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:44 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 26 12:46:44 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 26 12:46:45 np0005596062 python3.9[97294]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:46:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:45.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:45.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:47 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 26 12:46:47 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 26 12:46:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:47.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:47.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:48 np0005596062 python3.9[97449]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:46:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:49.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:49.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:49 np0005596062 python3.9[97603]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 26 12:46:50 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 26 12:46:50 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 26 12:46:51 np0005596062 systemd-logind[781]: Session 35 logged out. Waiting for processes to exit.
Jan 26 12:46:51 np0005596062 systemd[1]: session-35.scope: Deactivated successfully.
Jan 26 12:46:51 np0005596062 systemd[1]: session-35.scope: Consumed 18.864s CPU time.
Jan 26 12:46:51 np0005596062 systemd-logind[781]: Removed session 35.
Jan 26 12:46:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:51.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:51.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:52 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 26 12:46:52 np0005596062 ceph-osd[79865]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 26 12:46:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:46:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:53.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:46:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:53.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:55.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:55.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:56 np0005596062 systemd-logind[781]: New session 36 of user zuul.
Jan 26 12:46:56 np0005596062 systemd[1]: Started Session 36 of User zuul.
Jan 26 12:46:57 np0005596062 python3.9[97835]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:46:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:57.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:57.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:58 np0005596062 python3.9[97990]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:46:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:46:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:46:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:46:59.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:46:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:46:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:46:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:46:59.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:46:59 np0005596062 python3.9[98183]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:47:00 np0005596062 systemd[1]: session-36.scope: Deactivated successfully.
Jan 26 12:47:00 np0005596062 systemd[1]: session-36.scope: Consumed 2.468s CPU time.
Jan 26 12:47:00 np0005596062 systemd-logind[781]: Session 36 logged out. Waiting for processes to exit.
Jan 26 12:47:00 np0005596062 systemd-logind[781]: Removed session 36.
Jan 26 12:47:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:01.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:01.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:03.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:05.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:05.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:06 np0005596062 systemd-logind[781]: New session 37 of user zuul.
Jan 26 12:47:06 np0005596062 systemd[1]: Started Session 37 of User zuul.
Jan 26 12:47:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:07.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:07 np0005596062 python3.9[98366]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:47:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:07.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:08 np0005596062 python3.9[98521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:47:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:09.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:09 np0005596062 podman[98829]: 2026-01-26 17:47:09.532124879 +0000 UTC m=+0.067237543 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 12:47:09 np0005596062 podman[98829]: 2026-01-26 17:47:09.621742208 +0000 UTC m=+0.156854772 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 12:47:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:09.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:09 np0005596062 python3.9[98865]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:47:10 np0005596062 podman[99014]: 2026-01-26 17:47:10.272140634 +0000 UTC m=+0.059293115 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:47:10 np0005596062 podman[99014]: 2026-01-26 17:47:10.282070564 +0000 UTC m=+0.069223015 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:47:10 np0005596062 podman[99150]: 2026-01-26 17:47:10.523201654 +0000 UTC m=+0.075588332 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, description=keepalived for Ceph, io.openshift.expose-services=, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Jan 26 12:47:10 np0005596062 podman[99150]: 2026-01-26 17:47:10.539556462 +0000 UTC m=+0.091943150 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, build-date=2023-02-22T09:23:20, version=2.2.4, release=1793, com.redhat.component=keepalived-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2)
Jan 26 12:47:10 np0005596062 python3.9[99158]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:47:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:11.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:47:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:47:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:47:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:47:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:47:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:11.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:13 np0005596062 python3.9[99469]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:47:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:13.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:13.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:15 np0005596062 python3.9[99715]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:15.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:16 np0005596062 python3.9[99868]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:47:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:17.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:17 np0005596062 python3.9[100032]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:47:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:17.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:18 np0005596062 python3.9[100110]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:18 np0005596062 python3.9[100263]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:47:19 np0005596062 python3.9[100341]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:47:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:47:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:19.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:47:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:19.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:47:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:47:20 np0005596062 python3.9[100517]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:47:21 np0005596062 python3.9[100696]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:47:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:21.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:21.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:21 np0005596062 python3.9[100848]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:47:22 np0005596062 python3.9[101001]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:47:23 np0005596062 python3.9[101153]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:47:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:47:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:23.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:47:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:23.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:25.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:25 np0005596062 python3.9[101307]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:47:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:25.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:26 np0005596062 python3.9[101462]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:47:27 np0005596062 python3.9[101614]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:47:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:27.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:27.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:28 np0005596062 python3.9[101767]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:47:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:29 np0005596062 python3.9[101920]: ansible-service_facts Invoked
Jan 26 12:47:29 np0005596062 network[101937]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:47:29 np0005596062 network[101938]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:47:29 np0005596062 network[101939]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:47:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:29.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:29.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:31.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:31.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:33.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:33.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:35 np0005596062 python3.9[102397]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:47:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:35.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:35.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:37.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:37.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:38 np0005596062 python3.9[102602]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 12:47:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:39.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:40 np0005596062 python3.9[102755]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:47:41 np0005596062 python3.9[102833]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:47:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:41.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:47:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:41.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:42 np0005596062 python3.9[102986]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:47:42 np0005596062 python3.9[103064]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:43.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:43.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:44 np0005596062 python3.9[103217]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:45.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:45.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:46 np0005596062 python3.9[103370]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:47:47 np0005596062 python3.9[103454]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:47:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:47.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:47.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:48 np0005596062 systemd[1]: session-37.scope: Deactivated successfully.
Jan 26 12:47:48 np0005596062 systemd[1]: session-37.scope: Consumed 25.815s CPU time.
Jan 26 12:47:48 np0005596062 systemd-logind[781]: Session 37 logged out. Waiting for processes to exit.
Jan 26 12:47:48 np0005596062 systemd-logind[781]: Removed session 37.
Jan 26 12:47:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:49.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:49.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:51.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:47:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:51.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:47:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:53.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:53.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:54 np0005596062 systemd-logind[781]: New session 38 of user zuul.
Jan 26 12:47:54 np0005596062 systemd[1]: Started Session 38 of User zuul.
Jan 26 12:47:55 np0005596062 python3.9[103640]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:47:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:55.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:47:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:55.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:56 np0005596062 python3.9[103842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:47:56 np0005596062 python3.9[103921]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:47:56 np0005596062 systemd[1]: session-38.scope: Deactivated successfully.
Jan 26 12:47:56 np0005596062 systemd[1]: session-38.scope: Consumed 1.846s CPU time.
Jan 26 12:47:56 np0005596062 systemd-logind[781]: Session 38 logged out. Waiting for processes to exit.
Jan 26 12:47:56 np0005596062 systemd-logind[781]: Removed session 38.
Jan 26 12:47:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:57.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:57.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.306516) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449678306585, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2632, "num_deletes": 251, "total_data_size": 5726656, "memory_usage": 5785920, "flush_reason": "Manual Compaction"}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449678333866, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3729791, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7197, "largest_seqno": 9824, "table_properties": {"data_size": 3719440, "index_size": 6269, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 25453, "raw_average_key_size": 21, "raw_value_size": 3696826, "raw_average_value_size": 3098, "num_data_blocks": 278, "num_entries": 1193, "num_filter_entries": 1193, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449490, "oldest_key_time": 1769449490, "file_creation_time": 1769449678, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 27974 microseconds, and 10592 cpu microseconds.
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.334489) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3729791 bytes OK
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.334517) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.337262) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.337285) EVENT_LOG_v1 {"time_micros": 1769449678337278, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.337310) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5714641, prev total WAL file size 5714641, number of live WAL files 2.
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.339905) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3642KB)], [15(7669KB)]
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449678339992, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11583534, "oldest_snapshot_seqno": -1}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3860 keys, 9993758 bytes, temperature: kUnknown
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449678689257, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9993758, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9961661, "index_size": 21363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9669, "raw_key_size": 93195, "raw_average_key_size": 24, "raw_value_size": 9885760, "raw_average_value_size": 2561, "num_data_blocks": 933, "num_entries": 3860, "num_filter_entries": 3860, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769449678, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.689664) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9993758 bytes
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.691978) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 33.2 rd, 28.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 7.5 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(5.8) write-amplify(2.7) OK, records in: 4383, records dropped: 523 output_compression: NoCompression
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.692021) EVENT_LOG_v1 {"time_micros": 1769449678691998, "job": 6, "event": "compaction_finished", "compaction_time_micros": 349385, "compaction_time_cpu_micros": 23762, "output_level": 6, "num_output_files": 1, "total_output_size": 9993758, "num_input_records": 4383, "num_output_records": 3860, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449678693529, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449678696686, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.339554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.696928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.696940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.696942) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.696944) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:47:58.696946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:47:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:47:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:47:59.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:47:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:47:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:47:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:47:59.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:01.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:01.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:02 np0005596062 systemd-logind[781]: New session 39 of user zuul.
Jan 26 12:48:02 np0005596062 systemd[1]: Started Session 39 of User zuul.
Jan 26 12:48:03 np0005596062 python3.9[104103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:48:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:03.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:03.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:04 np0005596062 python3.9[104260]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:05 np0005596062 python3.9[104435]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:05.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:05.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:05 np0005596062 python3.9[104513]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.gbfpqlnm recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:07 np0005596062 python3.9[104666]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:07 np0005596062 python3.9[104744]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.kz4kh6v4 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:07.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:07.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:08 np0005596062 python3.9[104897]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:48:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:09 np0005596062 python3.9[105049]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:09 np0005596062 python3.9[105127]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:48:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:09.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:09.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:10 np0005596062 python3.9[105280]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:10 np0005596062 python3.9[105358]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:48:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:11.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:11.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:12 np0005596062 python3.9[105511]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:13 np0005596062 python3.9[105663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:13.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:13 np0005596062 python3.9[105742]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:13.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:14 np0005596062 python3.9[105895]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:14 np0005596062 python3.9[105973]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:15.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:15.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:16 np0005596062 python3.9[106126]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:48:16 np0005596062 systemd[1]: Reloading.
Jan 26 12:48:16 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:48:16 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:48:17 np0005596062 python3.9[106366]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:17 np0005596062 python3.9[106444]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:17.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:17.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:18 np0005596062 python3.9[106597]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:18 np0005596062 python3.9[106675]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:19 np0005596062 python3.9[106827]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:48:19 np0005596062 systemd[1]: Reloading.
Jan 26 12:48:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:19.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:19 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:48:19 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:48:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:19.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:19 np0005596062 systemd[1]: Starting Create netns directory...
Jan 26 12:48:19 np0005596062 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 12:48:19 np0005596062 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 12:48:19 np0005596062 systemd[1]: Finished Create netns directory.
Jan 26 12:48:20 np0005596062 python3.9[107120]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:48:20 np0005596062 network[107189]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:48:20 np0005596062 network[107193]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:48:20 np0005596062 network[107195]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:48:21 np0005596062 podman[107214]: 2026-01-26 17:48:21.040378609 +0000 UTC m=+0.073950694 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:48:21 np0005596062 podman[107214]: 2026-01-26 17:48:21.145062706 +0000 UTC m=+0.178634771 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507)
Jan 26 12:48:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:21.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:21.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:22 np0005596062 podman[107402]: 2026-01-26 17:48:22.162524383 +0000 UTC m=+0.056257630 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:48:22 np0005596062 podman[107402]: 2026-01-26 17:48:22.43914726 +0000 UTC m=+0.332880517 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:48:22 np0005596062 podman[107476]: 2026-01-26 17:48:22.789080945 +0000 UTC m=+0.180893802 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vcs-type=git, architecture=x86_64, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived)
Jan 26 12:48:22 np0005596062 podman[107496]: 2026-01-26 17:48:22.887977797 +0000 UTC m=+0.076542884 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph)
Jan 26 12:48:23 np0005596062 podman[107476]: 2026-01-26 17:48:23.234945463 +0000 UTC m=+0.626758280 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vcs-type=git, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=)
Jan 26 12:48:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:23.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:23.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:48:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:48:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:48:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:48:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:48:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:25.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:25.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:27 np0005596062 python3.9[107861]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:27 np0005596062 python3.9[107939]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:27.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:27.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:29 np0005596062 python3.9[108092]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:29.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:30 np0005596062 python3.9[108245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:30 np0005596062 python3.9[108323]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:31.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:48:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:48:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:48:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:31.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:48:32 np0005596062 python3.9[108526]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 12:48:32 np0005596062 systemd[1]: Starting Time & Date Service...
Jan 26 12:48:32 np0005596062 systemd[1]: Started Time & Date Service.
Jan 26 12:48:33 np0005596062 python3.9[108682]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:33.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:33 np0005596062 python3.9[108834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:33.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:34 np0005596062 python3.9[108913]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:34 np0005596062 python3.9[109065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:35 np0005596062 python3.9[109143]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zuqtxpw3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:35.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:35.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:36 np0005596062 python3.9[109308]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:36 np0005596062 python3.9[109424]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:37 np0005596062 python3.9[109576]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:48:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:37.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:37.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:38 np0005596062 python3[109730]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 12:48:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:39 np0005596062 python3.9[109882]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:39.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:39 np0005596062 python3.9[109960]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:48:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:39.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:48:40 np0005596062 python3.9[110113]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:41 np0005596062 python3.9[110238]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449719.9472706-902-74141765728810/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.489660) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449721489745, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 652, "num_deletes": 250, "total_data_size": 1183400, "memory_usage": 1210104, "flush_reason": "Manual Compaction"}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449721495497, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 552031, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9829, "largest_seqno": 10476, "table_properties": {"data_size": 549077, "index_size": 926, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7462, "raw_average_key_size": 19, "raw_value_size": 543001, "raw_average_value_size": 1440, "num_data_blocks": 40, "num_entries": 377, "num_filter_entries": 377, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449679, "oldest_key_time": 1769449679, "file_creation_time": 1769449721, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 5881 microseconds, and 2425 cpu microseconds.
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.495548) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 552031 bytes OK
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.495565) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.497425) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.497447) EVENT_LOG_v1 {"time_micros": 1769449721497440, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.497465) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1179834, prev total WAL file size 1179834, number of live WAL files 2.
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.498050) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(539KB)], [18(9759KB)]
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449721498123, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10545789, "oldest_snapshot_seqno": -1}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3737 keys, 7811912 bytes, temperature: kUnknown
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449721560534, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7811912, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7783778, "index_size": 17720, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9349, "raw_key_size": 91122, "raw_average_key_size": 24, "raw_value_size": 7713051, "raw_average_value_size": 2063, "num_data_blocks": 773, "num_entries": 3737, "num_filter_entries": 3737, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769449721, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.560953) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7811912 bytes
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.562768) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 168.7 rd, 125.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 9.5 +0.0 blob) out(7.5 +0.0 blob), read-write-amplify(33.3) write-amplify(14.2) OK, records in: 4237, records dropped: 500 output_compression: NoCompression
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.562807) EVENT_LOG_v1 {"time_micros": 1769449721562790, "job": 8, "event": "compaction_finished", "compaction_time_micros": 62498, "compaction_time_cpu_micros": 18538, "output_level": 6, "num_output_files": 1, "total_output_size": 7811912, "num_input_records": 4237, "num_output_records": 3737, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449721563232, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449721565159, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.497894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.565218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.565224) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.565228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.565229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:48:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:48:41.565231) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:48:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:41.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:41.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:41 np0005596062 python3.9[110390]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:42 np0005596062 python3.9[110469]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:43 np0005596062 python3.9[110621]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:43 np0005596062 python3.9[110699]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:43.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:43.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:44 np0005596062 python3.9[110852]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:48:45 np0005596062 python3.9[110930]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:45.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:45.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:46 np0005596062 python3.9[111083]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:48:47 np0005596062 python3.9[111238]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:47.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:47.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:48 np0005596062 python3.9[111391]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:49 np0005596062 python3.9[111543]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:48:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:49.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:49.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:50 np0005596062 python3.9[111696]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 12:48:51 np0005596062 python3.9[111848]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 12:48:51 np0005596062 systemd-logind[781]: Session 39 logged out. Waiting for processes to exit.
Jan 26 12:48:51 np0005596062 systemd[1]: session-39.scope: Deactivated successfully.
Jan 26 12:48:51 np0005596062 systemd[1]: session-39.scope: Consumed 31.845s CPU time.
Jan 26 12:48:51 np0005596062 systemd-logind[781]: Removed session 39.
Jan 26 12:48:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:51.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:51.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:53.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:53.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:55.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:55.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:57 np0005596062 systemd-logind[781]: New session 40 of user zuul.
Jan 26 12:48:57 np0005596062 systemd[1]: Started Session 40 of User zuul.
Jan 26 12:48:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:48:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:57.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:48:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:57.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:58 np0005596062 python3.9[112081]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 12:48:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:48:59 np0005596062 python3.9[112234]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:48:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:48:59.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:48:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:48:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:48:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:48:59.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:00 np0005596062 python3.9[112389]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 26 12:49:01 np0005596062 python3.9[112541]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.vta71gc4 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:01.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:01 np0005596062 python3.9[112666]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.vta71gc4 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449740.6587274-109-52055319715836/.source.vta71gc4 _original_basename=.a3pctue5 follow=False checksum=a595db097bfa207c8b20f83c8c918987b40a76ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:01.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:02 np0005596062 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 12:49:02 np0005596062 python3.9[112821]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:49:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:49:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:03.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:49:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:03.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:03 np0005596062 python3.9[112973]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC8WJSSyps5/MOwluaYVKvHLbB3OOMaGha+S5zKQqPSAcedSyuyvzK3GC+qad2ZbcfCfiNZHWM+ylBueRDL14BxpBXCAqNKHN1Yo1Fvlb4JCkcbhbgkVGemDEsbBiNmTtSlxRI40uI8M0+E42b22Zh7qz1PC1XmS0po5y6SwzcfgbnZtuyVFsvGHqDWkkWV/gsjiZ57qMaC+DJaIhvfW+qObinKJqXeuPQbF6yjfhXPHf2nwYEGY9rM5zEvZyfC/Dnrg62lDFjq4LGLrb83ipcBQq+zMejeECDs/u6noWAMs8f5HcxW0zembv86K5pOtPJKA13xVImv+kfGS+EctaKEBB/ooqOhN9AdXFEJUuSDn/2iUm07NnrEN9WhrfiuxLCO/lBWwxFGKcQECRviuCwE51F4fVEduv4ZiDgPcsHo+fYbxXsG50xc8/Yumd+a60pkpu09wVk1P3fCbFbRd9kD4elm067blILF+Zs+YuWnuaK3LiCb+qzmDKQB4AArubE=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILePT0ow4c3ejDoUzP/5T/dIHfr1xTtwEP/2z/Lf68vz#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB0tLEQbxQsuF0gTFyU7HBbMRjNrt7rMl1+QXcK3yfs0Q29raINYHrTVwzWeSuTUiO464HBZr4aPyLzhd+2Z3xs=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0zaLI2LTbNOyYJLCkCHwBNvCbWxyjbFipOdeKx9WVOSI6BraalDHlRpumUYDm8JC8abEq1qaZCBLmxjPXdZu5OGr/kPmf6SKEUmhy4iVIlqya8lpE59ci/zJO3FmNG+BncaGfJAQ0wqUgfNc/27u/wxD+gMrd6Ocz1dRHjtV22N4KnHAZP+sb0G1LZUx4WhJ07B4r/YaWeXOL2puHk0zHfnxSMIyyEvTlx9zlqSArxDuyq6AA7skTmkIlIC7eYbws7R3oP5PdtDl0sj1SEaTS4uAOSxbcYCV3H/IBa5evA+pxo7m3gf2YQ/QsGcfMQF4GefF3pWfZN0BGK7DWb3bckv62Oq9geYx47ccajXIEt3vsncvsrZhozX5OPyxW4eLJ8r7ovCX+5uGTuF9LrmwDdc7XRJ7rXBWSKh66/yxUcPGEQIk7OoEA30ZmKeipyMJQHHrWKxAqkqz6+ZQ41KvXaFIB1lRQf4tlFTAfrm9xwChyoCfrU95QYM4V+zqCQ6E=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILbMiL3+EkWDKAQHi9JT5Xqvk8rNrdT5SVX2Gg2RyqsV#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLVPallz3Z+vrxzfd9Dxuo/G10ZpIDOna2ftaoWWaEiUQrn77C3vB8d1zHHnHxMi8qaS4W4lfA32FenhGfBnVVU=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCOz38rnMu5RPID5R9a4AOkL2Ge6a4dzWxjmOZKIbuidITYge9lyZ+ThI161k8ZELWw9SBoQvNwVmySyCRLJH9qPhNCVmEqUqZJohUEZQ+lNpyZk3JkhZsgLTYjkdV/DPqp3iLlV/asPhl18j+CFKmN5Dx0qMsAg1f9CbOZwhdgeVEeB3IqdjBrPIMgAwVlacU9ty90SAUJj+RoMZePfAh7i2q7VTPHcvKRA1Mz4Q+RRKojI3DfR0se9vFL9KYNhD/O0JbAZksdom7tVuZ6LjcyIYqBUeB2jYwSO66sVFNWI4JwFEr5OOb1EiOGWGudWuZVfdeD+TYeZk0hco2GhtmXBVDWWeYQNNXAKRcQ7aM2y9SlN6gOKzJq08LuoShMOl8IuErTDV7Cp3WpuPPqDc5gv0swDVoOXsbju1Bxm2aLE7d1GiJbuhLS+pvIgc0MrnyOhUrTGTAdyfZ4gsw6BekK5Gf22C6xvZ865/N5LCr5jahKtqujZ6X6sECNsBQ1j0M=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOdmNmdvqfqzPDx4l6nvkEw8mwn78xc6LydRgAb6QEGT#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKb0RFR0G0BOVptSrXD3m/y/AD2q+whTWANps4FtvEcdq4zrHxHJM7JO/mkAyT4VEcyt7wmguNEWF5NqwEZeFZ4=#012 create=True mode=0644 path=/tmp/ansible.vta71gc4 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:04 np0005596062 python3.9[113126]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.vta71gc4' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:49:05 np0005596062 python3.9[113280]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.vta71gc4 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:05.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:05.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:05 np0005596062 systemd-logind[781]: Session 40 logged out. Waiting for processes to exit.
Jan 26 12:49:05 np0005596062 systemd[1]: session-40.scope: Deactivated successfully.
Jan 26 12:49:05 np0005596062 systemd[1]: session-40.scope: Consumed 5.363s CPU time.
Jan 26 12:49:05 np0005596062 systemd-logind[781]: Removed session 40.
Jan 26 12:49:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:07.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:07.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:09.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:09.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:11 np0005596062 systemd-logind[781]: New session 41 of user zuul.
Jan 26 12:49:11 np0005596062 systemd[1]: Started Session 41 of User zuul.
Jan 26 12:49:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:49:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:11.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:49:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:11.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:12 np0005596062 python3.9[113463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:49:13 np0005596062 python3.9[113619]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 12:49:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:13.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:13.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:14 np0005596062 python3.9[113774]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:49:15 np0005596062 python3.9[113927]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:49:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:15.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:15.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:16 np0005596062 python3.9[114112]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:49:17 np0005596062 python3.9[114283]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:17.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:17 np0005596062 systemd[1]: session-41.scope: Deactivated successfully.
Jan 26 12:49:17 np0005596062 systemd[1]: session-41.scope: Consumed 4.214s CPU time.
Jan 26 12:49:17 np0005596062 systemd-logind[781]: Session 41 logged out. Waiting for processes to exit.
Jan 26 12:49:17 np0005596062 systemd-logind[781]: Removed session 41.
Jan 26 12:49:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:49:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:17.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:49:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:19.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:19.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:21.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:21.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:23 np0005596062 systemd-logind[781]: New session 42 of user zuul.
Jan 26 12:49:23 np0005596062 systemd[1]: Started Session 42 of User zuul.
Jan 26 12:49:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:23.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:23.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:24 np0005596062 python3.9[114466]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:49:25 np0005596062 python3.9[114622]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:49:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:25.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:25.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:26 np0005596062 python3.9[114707]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 12:49:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:27.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:27.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:28 np0005596062 python3.9[114859]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:49:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:29.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:29.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:30 np0005596062 python3.9[115011]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 12:49:30 np0005596062 python3.9[115161]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:49:31 np0005596062 python3.9[115387]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:49:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:31.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:31.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:32 np0005596062 podman[115508]: 2026-01-26 17:49:32.048892683 +0000 UTC m=+0.069291481 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Jan 26 12:49:32 np0005596062 systemd[1]: session-42.scope: Deactivated successfully.
Jan 26 12:49:32 np0005596062 systemd[1]: session-42.scope: Consumed 6.196s CPU time.
Jan 26 12:49:32 np0005596062 systemd-logind[781]: Session 42 logged out. Waiting for processes to exit.
Jan 26 12:49:32 np0005596062 systemd-logind[781]: Removed session 42.
Jan 26 12:49:32 np0005596062 podman[115508]: 2026-01-26 17:49:32.171113642 +0000 UTC m=+0.191512440 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0)
Jan 26 12:49:33 np0005596062 podman[115665]: 2026-01-26 17:49:33.135939116 +0000 UTC m=+0.059647457 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:49:33 np0005596062 podman[115665]: 2026-01-26 17:49:33.152212754 +0000 UTC m=+0.075921095 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 12:49:33 np0005596062 podman[115732]: 2026-01-26 17:49:33.390177482 +0000 UTC m=+0.063880618 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, release=1793, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph)
Jan 26 12:49:33 np0005596062 podman[115732]: 2026-01-26 17:49:33.401332285 +0000 UTC m=+0.075035401 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc.)
Jan 26 12:49:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:33.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:33.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:49:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:49:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:49:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:49:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:49:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:35.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:35.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:37 np0005596062 systemd-logind[781]: New session 43 of user zuul.
Jan 26 12:49:37 np0005596062 systemd[1]: Started Session 43 of User zuul.
Jan 26 12:49:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:37.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:37.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:38 np0005596062 python3.9[116102]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:49:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:39.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:39.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:40 np0005596062 python3.9[116259]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:41 np0005596062 python3.9[116411]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:41.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:41.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:41 np0005596062 python3.9[116563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:49:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:49:42 np0005596062 python3.9[116737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449781.5192838-159-79493048554951/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d94398b6c1ee5c6beae0155d542ee06316ab7af2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:43 np0005596062 python3.9[116889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:43.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:43 np0005596062 python3.9[117012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449782.878497-159-46614222269294/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7f2574ee4e0949b6273e9da9f87244a58b33ceaa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:43.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:44 np0005596062 python3.9[117165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:45 np0005596062 python3.9[117288]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449784.0478683-159-168140560130806/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b98bc3711d90af040dff4b3ade233f4f11577ac4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:45.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:45 np0005596062 python3.9[117440]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:45.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:46 np0005596062 python3.9[117593]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:47 np0005596062 python3.9[117745]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:47 np0005596062 python3.9[117868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449786.6554565-336-109605827468057/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=e9d933165fe438ce0e4582af2375e63e81e162a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:47.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:47.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:48 np0005596062 python3.9[118021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:48 np0005596062 python3.9[118144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449787.9461505-336-76252654916155/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=b45673754a8986e9f47c98c3e35456cb9dfc3d3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:49 np0005596062 python3.9[118296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:49.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:49.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:50 np0005596062 python3.9[118420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449789.0734253-336-134151886026018/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=c462ccca77e70968dd91d4c2c5c42b126caa2002 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:50 np0005596062 python3.9[118572]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:51 np0005596062 python3.9[118724]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:51.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:51.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:52 np0005596062 python3.9[118877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:52 np0005596062 python3.9[119000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449791.7938855-518-161279452579048/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=5beff5b10d98fd207d0ec45c4b7707f0798fdb41 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:53 np0005596062 python3.9[119152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:53.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:53.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:54 np0005596062 python3.9[119276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449793.0768542-518-181715612879899/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=b45673754a8986e9f47c98c3e35456cb9dfc3d3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:54 np0005596062 python3.9[119428]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:55 np0005596062 python3.9[119551]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449794.3976748-518-60984836240081/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=2d29fbb28d8456df28263ca6807e54bf6fb4c98f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:55.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:55.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:49:56 np0005596062 python3.9[119704]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:57 np0005596062 python3.9[119906]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:57.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:57 np0005596062 python3.9[120029]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449796.8011565-727-259433753691868/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:49:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:57.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:58 np0005596062 python3.9[120182]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:49:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:49:59 np0005596062 python3.9[120334]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:49:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:49:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:49:59.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:49:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:49:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:49:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:49:59.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:00 np0005596062 python3.9[120458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449798.9788582-806-15036119688605/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:00 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 12:50:00 np0005596062 python3.9[120610]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:50:01 np0005596062 python3.9[120762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:01.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:01.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:02 np0005596062 python3.9[120886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449801.3520393-885-208251953181062/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:03 np0005596062 python3.9[121038]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:50:03 np0005596062 python3.9[121192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:03.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:03.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:04 np0005596062 python3.9[121316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449803.2826796-958-280187857203831/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:05 np0005596062 python3.9[121468]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:50:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:05.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:05 np0005596062 python3.9[121620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:05.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:06 np0005596062 python3.9[121744]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449805.3804977-1034-27582158743071/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:07 np0005596062 python3.9[121896]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:50:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:07.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:07 np0005596062 python3.9[122048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:07.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:08 np0005596062 python3.9[122172]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449807.3686357-1102-4524767179993/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=9c020ad993969d6201452a9427187b11fbbe4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:08 np0005596062 systemd[1]: session-43.scope: Deactivated successfully.
Jan 26 12:50:08 np0005596062 systemd[1]: session-43.scope: Consumed 25.255s CPU time.
Jan 26 12:50:08 np0005596062 systemd-logind[781]: Session 43 logged out. Waiting for processes to exit.
Jan 26 12:50:08 np0005596062 systemd-logind[781]: Removed session 43.
Jan 26 12:50:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:09.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:09.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:11.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:11.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:13.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:13.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:14 np0005596062 systemd-logind[781]: New session 44 of user zuul.
Jan 26 12:50:14 np0005596062 systemd[1]: Started Session 44 of User zuul.
Jan 26 12:50:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:15 np0005596062 python3.9[122356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:15.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:15 np0005596062 python3.9[122508]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:15.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:16 np0005596062 python3.9[122632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449815.2634852-64-31739121279138/.source.conf _original_basename=ceph.conf follow=False checksum=1cb6012f361f0c9e471f352b73a07eaa73c38d31 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:17.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:17.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:17 np0005596062 python3.9[122834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:18 np0005596062 python3.9[122958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449817.4087255-64-87233164198110/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=395d1c083c7c30077cae22673689037cb8c534c6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:19 np0005596062 systemd[1]: session-44.scope: Deactivated successfully.
Jan 26 12:50:19 np0005596062 systemd[1]: session-44.scope: Consumed 2.991s CPU time.
Jan 26 12:50:19 np0005596062 systemd-logind[781]: Session 44 logged out. Waiting for processes to exit.
Jan 26 12:50:19 np0005596062 systemd-logind[781]: Removed session 44.
Jan 26 12:50:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:19.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:19.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:21.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:21.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:23.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:23.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:25 np0005596062 systemd-logind[781]: New session 45 of user zuul.
Jan 26 12:50:25 np0005596062 systemd[1]: Started Session 45 of User zuul.
Jan 26 12:50:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:25.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:25.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:26 np0005596062 python3.9[123140]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:50:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:27.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:27.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:28 np0005596062 python3.9[123297]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:50:28 np0005596062 python3.9[123449]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:50:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:29 np0005596062 python3.9[123599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:50:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:29.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:30 np0005596062 python3.9[123752]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 12:50:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:31.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:31.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:33.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:33.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:35.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:36 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 26 12:50:36 np0005596062 python3.9[123911]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:50:37 np0005596062 python3.9[124045]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:50:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:37.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:37.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:39.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:39.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:40 np0005596062 python3.9[124199]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:50:41 np0005596062 python3[124355]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 26 12:50:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:41.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:41.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:42 np0005596062 python3.9[124507]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:42 np0005596062 python3.9[124792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:43 np0005596062 python3.9[124870]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:50:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:43 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:50:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:43.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:43.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:44 np0005596062 python3.9[125023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:44 np0005596062 python3.9[125101]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6nj6l3xj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:45 np0005596062 python3.9[125253]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:45.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:45 np0005596062 python3.9[125331]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:45.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:46 np0005596062 python3.9[125484]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:50:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:47.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:47 np0005596062 python3[125637]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 12:50:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:47.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:48 np0005596062 python3.9[125790]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:49 np0005596062 python3.9[125915]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449848.1073337-434-124893677588523/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:49.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:50:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:49.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:50 np0005596062 python3.9[126068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:50 np0005596062 python3.9[126193]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449849.6617985-480-248181225172123/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:51 np0005596062 python3.9[126345]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:51.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:51.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:52 np0005596062 python3.9[126471]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449851.1616566-524-129493607769367/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:53 np0005596062 python3.9[126623]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:53 np0005596062 python3.9[126749]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449852.6288016-569-97052464302421/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:53.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:53.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:54 np0005596062 python3.9[126903]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:50:55 np0005596062 python3.9[127028]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769449854.049927-614-278694015884363/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:55.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:56.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:56 np0005596062 python3.9[127181]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:56 np0005596062 python3.9[127333]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:50:57 np0005596062 python3.9[127588]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:50:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:50:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:57.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:50:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:50:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:50:58.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:50:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:50:58 np0005596062 python3.9[127741]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:50:59 np0005596062 python3.9[127894]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:50:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:50:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:50:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:50:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:50:59.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:00.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:00 np0005596062 python3.9[128049]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:00 np0005596062 python3.9[128204]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:51:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:01.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:02.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:02 np0005596062 python3.9[128355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:51:03 np0005596062 python3.9[128508]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:03 np0005596062 ovs-vsctl[128509]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 26 12:51:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:04.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:04 np0005596062 python3.9[128662]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:05 np0005596062 python3.9[128817]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:05 np0005596062 ovs-vsctl[128818]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 26 12:51:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:51:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:06.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:06 np0005596062 python3.9[128969]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:51:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:07.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:08.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:08 np0005596062 python3.9[129124]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:51:09 np0005596062 python3.9[129276]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:09 np0005596062 python3.9[129354]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:51:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:09.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:10.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:10 np0005596062 python3.9[129507]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:10 np0005596062 python3.9[129585]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:51:11 np0005596062 python3.9[129737]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:51:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:11.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:12.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:12 np0005596062 python3.9[129890]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:12 np0005596062 python3.9[129968]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:13 np0005596062 python3.9[130120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:13.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:14.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:14 np0005596062 python3.9[130199]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:15 np0005596062 python3.9[130351]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:51:15 np0005596062 systemd[1]: Reloading.
Jan 26 12:51:15 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:51:15 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:51:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:15.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:16.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:17 np0005596062 python3.9[130543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:17 np0005596062 python3.9[130671]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:17.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:18.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:18 np0005596062 python3.9[130824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:18 np0005596062 python3.9[130902]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:19 np0005596062 python3.9[131054]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:51:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:19.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:20 np0005596062 systemd[1]: Reloading.
Jan 26 12:51:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:20.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:20 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:51:20 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:51:20 np0005596062 systemd[1]: Starting Create netns directory...
Jan 26 12:51:20 np0005596062 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 12:51:20 np0005596062 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 12:51:20 np0005596062 systemd[1]: Finished Create netns directory.
Jan 26 12:51:21 np0005596062 python3.9[131249]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:51:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:21.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:22.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:22 np0005596062 python3.9[131401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:22 np0005596062 python3.9[131525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449881.604906-1368-75799666794258/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:51:23 np0005596062 python3.9[131677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:23.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:24.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:24 np0005596062 python3.9[131830]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:51:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:25 np0005596062 python3.9[131982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:25 np0005596062 python3.9[132105]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449884.8984363-1466-202021840596850/.source.json _original_basename=.acj21qk5 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:25.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:26.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:26 np0005596062 python3.9[132256]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:28.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:28.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:28 np0005596062 python3.9[132680]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 26 12:51:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:30.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:30 np0005596062 python3.9[132832]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 12:51:31 np0005596062 python3[132985]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 12:51:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:32.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:32.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:34.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:34.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:36.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:51:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:36.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:37 np0005596062 podman[132999]: 2026-01-26 17:51:37.299931117 +0000 UTC m=+5.248688282 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 26 12:51:37 np0005596062 podman[133140]: 2026-01-26 17:51:37.464548399 +0000 UTC m=+0.052002082 container create e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 12:51:37 np0005596062 podman[133140]: 2026-01-26 17:51:37.435175789 +0000 UTC m=+0.022629542 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 26 12:51:37 np0005596062 python3[132985]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 26 12:51:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:38.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:38.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:38 np0005596062 python3.9[133358]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:51:39 np0005596062 python3.9[133512]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:40.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:51:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:40.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:40 np0005596062 python3.9[133588]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:51:40 np0005596062 python3.9[133740]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769449900.1782556-1700-176955687368420/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:41 np0005596062 python3.9[133816]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 12:51:41 np0005596062 systemd[1]: Reloading.
Jan 26 12:51:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:42.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:42 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:51:42 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:51:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:42.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:42 np0005596062 python3.9[133928]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:51:43 np0005596062 systemd[1]: Reloading.
Jan 26 12:51:43 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:51:43 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:51:43 np0005596062 systemd[1]: Starting ovn_controller container...
Jan 26 12:51:43 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:51:43 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78614f56d8c47d44204ceb5a98eb7225087ec331efcb7933ea5d32c9156b8c1a/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 12:51:43 np0005596062 systemd[1]: Started /usr/bin/podman healthcheck run e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b.
Jan 26 12:51:43 np0005596062 podman[133969]: 2026-01-26 17:51:43.532932281 +0000 UTC m=+0.167374606 container init e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 12:51:43 np0005596062 ovn_controller[133984]: + sudo -E kolla_set_configs
Jan 26 12:51:43 np0005596062 podman[133969]: 2026-01-26 17:51:43.569756339 +0000 UTC m=+0.204198584 container start e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 12:51:43 np0005596062 edpm-start-podman-container[133969]: ovn_controller
Jan 26 12:51:43 np0005596062 systemd[1]: Created slice User Slice of UID 0.
Jan 26 12:51:43 np0005596062 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 26 12:51:43 np0005596062 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 26 12:51:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 12:51:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2150 writes, 12K keys, 2150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 2150 writes, 2150 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2150 writes, 12K keys, 2150 commit groups, 1.0 writes per commit group, ingest: 23.52 MB, 0.04 MB/s#012Interval WAL: 2150 writes, 2150 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    111.5      0.10              0.05         4    0.026       0      0       0.0       0.0#012  L6      1/0    7.45 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.1     59.5     50.9      0.48              0.08         3    0.160     12K   1281       0.0       0.0#012 Sum      1/0    7.45 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.1     49.0     61.7      0.58              0.12         7    0.083     12K   1281       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.1     49.1     61.9      0.58              0.12         6    0.097     12K   1281       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     59.5     50.9      0.48              0.08         3    0.160     12K   1281       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    113.4      0.10              0.05         3    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.011, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.04 GB write, 0.06 MB/s write, 0.03 GB read, 0.05 MB/s read, 0.6 seconds#012Interval compaction: 0.04 GB write, 0.06 MB/s write, 0.03 GB read, 0.05 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 956.62 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000118 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(45,823.25 KB,0.264459%) FilterBlock(7,41.30 KB,0.0132661%) IndexBlock(7,92.08 KB,0.029579%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 12:51:43 np0005596062 systemd[1]: Starting User Manager for UID 0...
Jan 26 12:51:43 np0005596062 edpm-start-podman-container[133968]: Creating additional drop-in dependency for "ovn_controller" (e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b)
Jan 26 12:51:43 np0005596062 systemd[1]: Reloading.
Jan 26 12:51:43 np0005596062 podman[133991]: 2026-01-26 17:51:43.700029609 +0000 UTC m=+0.109055508 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 12:51:43 np0005596062 systemd[134015]: Queued start job for default target Main User Target.
Jan 26 12:51:43 np0005596062 systemd[134015]: Created slice User Application Slice.
Jan 26 12:51:43 np0005596062 systemd[134015]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 26 12:51:43 np0005596062 systemd[134015]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 12:51:43 np0005596062 systemd[134015]: Reached target Paths.
Jan 26 12:51:43 np0005596062 systemd[134015]: Reached target Timers.
Jan 26 12:51:43 np0005596062 systemd[134015]: Starting D-Bus User Message Bus Socket...
Jan 26 12:51:43 np0005596062 systemd[134015]: Starting Create User's Volatile Files and Directories...
Jan 26 12:51:43 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:51:43 np0005596062 systemd[134015]: Finished Create User's Volatile Files and Directories.
Jan 26 12:51:43 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:51:43 np0005596062 systemd[134015]: Listening on D-Bus User Message Bus Socket.
Jan 26 12:51:43 np0005596062 systemd[134015]: Reached target Sockets.
Jan 26 12:51:43 np0005596062 systemd[134015]: Reached target Basic System.
Jan 26 12:51:43 np0005596062 systemd[134015]: Reached target Main User Target.
Jan 26 12:51:43 np0005596062 systemd[134015]: Startup finished in 134ms.
Jan 26 12:51:43 np0005596062 systemd[1]: Started User Manager for UID 0.
Jan 26 12:51:43 np0005596062 systemd[1]: Started ovn_controller container.
Jan 26 12:51:43 np0005596062 systemd[1]: e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b-1a6049a29356d165.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 12:51:43 np0005596062 systemd[1]: e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b-1a6049a29356d165.service: Failed with result 'exit-code'.
Jan 26 12:51:43 np0005596062 systemd[1]: Started Session c1 of User root.
Jan 26 12:51:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:44.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: INFO:__main__:Validating config file
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: INFO:__main__:Writing out command to execute
Jan 26 12:51:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:44.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:44 np0005596062 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: ++ cat /run_command
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + ARGS=
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + sudo kolla_copy_cacerts
Jan 26 12:51:44 np0005596062 systemd[1]: Started Session c2 of User root.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + [[ ! -n '' ]]
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + . kolla_extend_start
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + umask 0022
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 26 12:51:44 np0005596062 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.1144] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.1150] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 12:51:44 np0005596062 kernel: br-int: entered promiscuous mode
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <warn>  [1769449904.1153] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 12:51:44 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.1159] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.1163] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.1166] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 26 12:51:44 np0005596062 systemd-udevd[134120]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 12:51:44 np0005596062 ovn_controller[133984]: 2026-01-26T17:51:44Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.3309] manager: (ovn-345392-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.3319] manager: (ovn-657115-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.3326] manager: (ovn-c76f25-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 26 12:51:44 np0005596062 kernel: genev_sys_6081: entered promiscuous mode
Jan 26 12:51:44 np0005596062 systemd-udevd[134122]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.3595] device (genev_sys_6081): carrier: link connected
Jan 26 12:51:44 np0005596062 NetworkManager[48993]: <info>  [1769449904.3599] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Jan 26 12:51:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:45 np0005596062 python3.9[134251]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 12:51:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:51:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:46.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:51:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:46.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:46 np0005596062 python3.9[134404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:51:47 np0005596062 python3.9[134527]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449906.0532498-1836-8069130947822/.source.yaml _original_basename=.2jjf8ckl follow=False checksum=d2889da2b79efa07e6f3a0ac50ac42a16f618171 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:51:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:48.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:48.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:48 np0005596062 python3.9[134680]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:48 np0005596062 ovs-vsctl[134681]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 26 12:51:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:50.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:50.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:50 np0005596062 python3.9[134834]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:50 np0005596062 ovs-vsctl[134836]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 26 12:51:51 np0005596062 python3.9[134989]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:51:51 np0005596062 ovs-vsctl[134990]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 26 12:51:52 np0005596062 systemd[1]: session-45.scope: Deactivated successfully.
Jan 26 12:51:52 np0005596062 systemd[1]: session-45.scope: Consumed 1min 513ms CPU time.
Jan 26 12:51:52 np0005596062 systemd-logind[781]: Session 45 logged out. Waiting for processes to exit.
Jan 26 12:51:52 np0005596062 systemd-logind[781]: Removed session 45.
Jan 26 12:51:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:52.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:51:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:52.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:51:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:54.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:54.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:54 np0005596062 systemd[1]: Stopping User Manager for UID 0...
Jan 26 12:51:54 np0005596062 systemd[134015]: Activating special unit Exit the Session...
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped target Main User Target.
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped target Basic System.
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped target Paths.
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped target Sockets.
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped target Timers.
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 12:51:54 np0005596062 systemd[134015]: Closed D-Bus User Message Bus Socket.
Jan 26 12:51:54 np0005596062 systemd[134015]: Stopped Create User's Volatile Files and Directories.
Jan 26 12:51:54 np0005596062 systemd[134015]: Removed slice User Application Slice.
Jan 26 12:51:54 np0005596062 systemd[134015]: Reached target Shutdown.
Jan 26 12:51:54 np0005596062 systemd[134015]: Finished Exit the Session.
Jan 26 12:51:54 np0005596062 systemd[134015]: Reached target Exit the Session.
Jan 26 12:51:54 np0005596062 systemd[1]: user@0.service: Deactivated successfully.
Jan 26 12:51:54 np0005596062 systemd[1]: Stopped User Manager for UID 0.
Jan 26 12:51:54 np0005596062 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 26 12:51:54 np0005596062 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 26 12:51:54 np0005596062 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 26 12:51:54 np0005596062 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 26 12:51:54 np0005596062 systemd[1]: Removed slice User Slice of UID 0.
Jan 26 12:51:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:51:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:56.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:56.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:58 np0005596062 systemd-logind[781]: New session 47 of user zuul.
Jan 26 12:51:58 np0005596062 systemd[1]: Started Session 47 of User zuul.
Jan 26 12:51:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:51:58.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:51:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:51:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:51:58.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:51:59 np0005596062 python3.9[135344]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:51:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:00.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:00.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:01 np0005596062 python3.9[135501]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:02.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:02.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.450930) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449922451016, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1944, "num_deletes": 251, "total_data_size": 4946922, "memory_usage": 4996560, "flush_reason": "Manual Compaction"}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 26 12:52:02 np0005596062 python3.9[135654]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449922644283, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3243652, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10481, "largest_seqno": 12420, "table_properties": {"data_size": 3235591, "index_size": 4940, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 15558, "raw_average_key_size": 19, "raw_value_size": 3219785, "raw_average_value_size": 4044, "num_data_blocks": 221, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449721, "oldest_key_time": 1769449721, "file_creation_time": 1769449922, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 193397 microseconds, and 7796 cpu microseconds.
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.644337) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3243652 bytes OK
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.644359) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.647024) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.647054) EVENT_LOG_v1 {"time_micros": 1769449922647045, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.647077) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4938427, prev total WAL file size 4955691, number of live WAL files 2.
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.649637) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(3167KB)], [21(7628KB)]
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449922649763, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 11055564, "oldest_snapshot_seqno": -1}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4014 keys, 8812575 bytes, temperature: kUnknown
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449922824991, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8812575, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8782465, "index_size": 18992, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 97504, "raw_average_key_size": 24, "raw_value_size": 8706704, "raw_average_value_size": 2169, "num_data_blocks": 820, "num_entries": 4014, "num_filter_entries": 4014, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769449922, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.825400) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8812575 bytes
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.828503) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 63.1 rd, 50.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 7.5 +0.0 blob) out(8.4 +0.0 blob), read-write-amplify(6.1) write-amplify(2.7) OK, records in: 4533, records dropped: 519 output_compression: NoCompression
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.828575) EVENT_LOG_v1 {"time_micros": 1769449922828524, "job": 10, "event": "compaction_finished", "compaction_time_micros": 175328, "compaction_time_cpu_micros": 48351, "output_level": 6, "num_output_files": 1, "total_output_size": 8812575, "num_input_records": 4533, "num_output_records": 4014, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449922829791, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769449922832593, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.649432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.832927) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.832955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.832958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.832962) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:52:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:52:02.832965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:52:03 np0005596062 python3.9[135907]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:04 np0005596062 python3.9[136090]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:04.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:04.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 12:52:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 12:52:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:52:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 26 12:52:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:52:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:04 np0005596062 python3.9[136245]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:05 np0005596062 python3.9[136395]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:52:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:06.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:06.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:52:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:52:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:52:07 np0005596062 python3.9[136548]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 12:52:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:08.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:08.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:09 np0005596062 python3.9[136699]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:10.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:10.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:10 np0005596062 python3.9[136820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449928.832705-221-142487159221395/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:11 np0005596062 python3.9[136971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:11 np0005596062 python3.9[137092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449930.462334-266-105933531078405/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:12.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:12.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:12 np0005596062 python3.9[137245]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:52:13 np0005596062 python3.9[137329]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:52:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:52:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:52:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:14.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:14.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:14 np0005596062 ovn_controller[133984]: 2026-01-26T17:52:14Z|00025|memory|INFO|15872 kB peak resident set size after 30.8 seconds
Jan 26 12:52:14 np0005596062 ovn_controller[133984]: 2026-01-26T17:52:14Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 26 12:52:14 np0005596062 podman[137382]: 2026-01-26 17:52:14.916387948 +0000 UTC m=+0.112838368 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 12:52:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 12:52:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5286 writes, 23K keys, 5286 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5286 writes, 797 syncs, 6.63 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5286 writes, 23K keys, 5286 commit groups, 1.0 writes per commit group, ingest: 18.59 MB, 0.03 MB/s#012Interval WAL: 5286 writes, 797 syncs, 6.63 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5647a9935350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5647a9935350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 26 12:52:16 np0005596062 python3.9[137561]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:52:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:16.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:16.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:18.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:18.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:18 np0005596062 python3.9[137766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:19 np0005596062 python3.9[137887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449937.374198-377-58887598671605/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:19 np0005596062 python3.9[138037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa1851fd6f0 =====
Jan 26 12:52:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa1851fd6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:20 np0005596062 radosgw[83289]: beast: 0x7fa1851fd6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:20.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:20.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:20 np0005596062 python3.9[138159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449939.247318-377-121958228060683/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:21 np0005596062 python3.9[138309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:22.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:22.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:22 np0005596062 python3.9[138431]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449941.2625258-508-36058677562037/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:22 np0005596062 python3.9[138581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:23 np0005596062 python3.9[138702]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449942.5018375-508-182537559213820/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:24.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:24.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:24 np0005596062 python3.9[138853]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:52:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:25 np0005596062 python3.9[139007]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:26.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:26.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:26 np0005596062 python3.9[139159]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:26 np0005596062 python3.9[139238]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:27 np0005596062 python3.9[139390]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:27 np0005596062 python3.9[139468]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:28.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:28.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:29 np0005596062 python3.9[139621]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:30.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:30.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:30 np0005596062 python3.9[139774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:30 np0005596062 python3.9[139852]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:31 np0005596062 python3.9[140004]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:31 np0005596062 python3.9[140082]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:32.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:32.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:32 np0005596062 python3.9[140235]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:52:32 np0005596062 systemd[1]: Reloading.
Jan 26 12:52:32 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:52:32 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:52:34 np0005596062 python3.9[140425]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:34.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:34.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:34 np0005596062 python3.9[140504]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:35 np0005596062 python3.9[140656]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:35 np0005596062 python3.9[140734]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:36.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:36.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:36 np0005596062 python3.9[140887]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:52:36 np0005596062 systemd[1]: Reloading.
Jan 26 12:52:36 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:52:36 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:52:36 np0005596062 systemd[1]: Starting Create netns directory...
Jan 26 12:52:36 np0005596062 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 12:52:36 np0005596062 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 12:52:36 np0005596062 systemd[1]: Finished Create netns directory.
Jan 26 12:52:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:38.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:38.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:38 np0005596062 python3.9[141131]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:39 np0005596062 python3.9[141283]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:39 np0005596062 python3.9[141406]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769449958.485271-962-66483357415056/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:40.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:40.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:40 np0005596062 python3.9[141559]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:41 np0005596062 python3.9[141711]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:52:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:42.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:42.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:42 np0005596062 python3.9[141864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:52:43 np0005596062 python3.9[141987]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449961.9886096-1060-121265824628553/.source.json _original_basename=.u6krr4r6 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:44 np0005596062 python3.9[142137]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:52:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:44.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:44.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:45 np0005596062 podman[142385]: 2026-01-26 17:52:45.289333866 +0000 UTC m=+0.103514258 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 12:52:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:46.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:46.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:46 np0005596062 python3.9[142590]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 26 12:52:47 np0005596062 python3.9[142742]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 12:52:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:48.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:48.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:49 np0005596062 python3[142895]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 12:52:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:50.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:52:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:50.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:52:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:52.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:52.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 26 12:52:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 26 12:52:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:54.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:54.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:52:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:56.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:56.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:52:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:52:58.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:52:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:52:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:52:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:52:58.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:52:59 np0005596062 ceph-mds[83671]: mds.beacon.cephfs.compute-2.oqvedy missed beacon ack from the monitors
Jan 26 12:52:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:00 np0005596062 podman[142909]: 2026-01-26 17:53:00.04681398 +0000 UTC m=+10.618702499 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 12:53:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:00.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:00.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:00 np0005596062 podman[143107]: 2026-01-26 17:53:00.275833355 +0000 UTC m=+0.080443028 container create db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 12:53:00 np0005596062 podman[143107]: 2026-01-26 17:53:00.236417683 +0000 UTC m=+0.041027406 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 12:53:00 np0005596062 python3[142895]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 12:53:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:02.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:02.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:02 np0005596062 python3.9[143298]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:53:03 np0005596062 python3.9[143452]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:04.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:04.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:04 np0005596062 python3.9[143529]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:53:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:05 np0005596062 python3.9[143680]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769449984.3960183-1294-146340765221594/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:05 np0005596062 python3.9[143756]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 12:53:05 np0005596062 systemd[1]: Reloading.
Jan 26 12:53:05 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:53:05 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:53:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:06.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:06.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:06 np0005596062 python3.9[143867]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:06 np0005596062 systemd[1]: Reloading.
Jan 26 12:53:06 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:53:06 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:53:06 np0005596062 systemd[1]: Starting ovn_metadata_agent container...
Jan 26 12:53:07 np0005596062 systemd[1]: Started libcrun container.
Jan 26 12:53:07 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16634149ba5e0c93e3d0c7e1cb79b5dda2b584eceb56c42e8551a4945d3be4c0/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 26 12:53:07 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16634149ba5e0c93e3d0c7e1cb79b5dda2b584eceb56c42e8551a4945d3be4c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 12:53:07 np0005596062 systemd[1]: Started /usr/bin/podman healthcheck run db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e.
Jan 26 12:53:07 np0005596062 podman[143908]: 2026-01-26 17:53:07.293924296 +0000 UTC m=+0.282305315 container init db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + sudo -E kolla_set_configs
Jan 26 12:53:07 np0005596062 podman[143908]: 2026-01-26 17:53:07.326853047 +0000 UTC m=+0.315234046 container start db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:53:07 np0005596062 edpm-start-podman-container[143908]: ovn_metadata_agent
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Validating config file
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Copying service configuration files
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Writing out command to execute
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: ++ cat /run_command
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + CMD=neutron-ovn-metadata-agent
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + ARGS=
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + sudo kolla_copy_cacerts
Jan 26 12:53:07 np0005596062 edpm-start-podman-container[143907]: Creating additional drop-in dependency for "ovn_metadata_agent" (db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e)
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + [[ ! -n '' ]]
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + . kolla_extend_start
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: Running command: 'neutron-ovn-metadata-agent'
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + umask 0022
Jan 26 12:53:07 np0005596062 ovn_metadata_agent[143924]: + exec neutron-ovn-metadata-agent
Jan 26 12:53:07 np0005596062 systemd[1]: Reloading.
Jan 26 12:53:07 np0005596062 podman[143930]: 2026-01-26 17:53:07.466712305 +0000 UTC m=+0.118652359 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 12:53:07 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:53:07 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:53:07 np0005596062 systemd[1]: Started ovn_metadata_agent container.
Jan 26 12:53:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:08.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:08.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.107 143929 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.108 143929 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.108 143929 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.108 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.108 143929 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.108 143929 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.109 143929 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.110 143929 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.111 143929 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.112 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.113 143929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.114 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.115 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.116 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.117 143929 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.118 143929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.119 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.120 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.121 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.122 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.123 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.124 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.125 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.126 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.127 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.128 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.129 143929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.130 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.131 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.132 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.133 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.134 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.135 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.136 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.137 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.138 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.139 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.139 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.139 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.139 143929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.139 143929 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.147 143929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.148 143929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.148 143929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.148 143929 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.148 143929 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.161 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 9838f21e-c1ce-4cfa-829e-a12b9d657d8a (UUID: 9838f21e-c1ce-4cfa-829e-a12b9d657d8a) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.200 143929 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.200 143929 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.200 143929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.200 143929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.204 143929 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.211 143929 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.217 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '9838f21e-c1ce-4cfa-829e-a12b9d657d8a'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], external_ids={}, name=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, nb_cfg_timestamp=1769449912135, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.218 143929 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f748f9a9f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.219 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.219 143929 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.219 143929 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.219 143929 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.223 143929 DEBUG oslo_service.service [-] Started child 144035 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.226 143929 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpd_mqwdgo/privsep.sock']#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.230 144035 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-364255'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.266 144035 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.266 144035 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.267 144035 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.271 144035 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.278 144035 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.285 144035 INFO eventlet.wsgi.server [-] (144035) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 26 12:53:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:09 np0005596062 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.902 143929 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.903 143929 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpd_mqwdgo/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.788 144040 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.798 144040 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.805 144040 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.805 144040 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144040#033[00m
Jan 26 12:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:09.907 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[32228ed1-1e2a-4a5b-b917-0fee42de48b0]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 12:53:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:10.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:53:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:10.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.377 144040 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.378 144040 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.378 144040 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.869 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[96174f24-3c8c-47ed-9a65-a08a6eb74a8f]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.873 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, column=external_ids, values=({'neutron:ovn-metadata-id': '325451db-cad0-5b90-96af-90e468a83a24'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.972 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.998 143929 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.998 143929 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.999 143929 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.999 143929 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 12:53:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.999 143929 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.999 143929 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.999 143929 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:10.999 143929 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.000 143929 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.000 143929 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.000 143929 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.000 143929 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.000 143929 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.000 143929 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.001 143929 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.002 143929 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.002 143929 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.002 143929 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.002 143929 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.002 143929 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.003 143929 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.003 143929 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.003 143929 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.003 143929 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.003 143929 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.004 143929 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.004 143929 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.004 143929 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.004 143929 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.005 143929 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.005 143929 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.005 143929 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.006 143929 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.006 143929 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.006 143929 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.006 143929 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.007 143929 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.007 143929 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.007 143929 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.007 143929 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.008 143929 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.008 143929 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.008 143929 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.008 143929 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.009 143929 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.009 143929 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.009 143929 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.009 143929 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.009 143929 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.010 143929 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.010 143929 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.010 143929 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.010 143929 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.011 143929 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.011 143929 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.011 143929 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.011 143929 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.011 143929 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.012 143929 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.012 143929 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.012 143929 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.012 143929 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.013 143929 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.013 143929 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.013 143929 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.013 143929 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.014 143929 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.014 143929 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.014 143929 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.014 143929 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.014 143929 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.015 143929 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.015 143929 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.015 143929 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.015 143929 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.015 143929 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.016 143929 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.016 143929 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.016 143929 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.016 143929 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.017 143929 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.017 143929 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.017 143929 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.017 143929 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.018 143929 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.018 143929 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.018 143929 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.018 143929 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.018 143929 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.019 143929 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.019 143929 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.019 143929 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.019 143929 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.019 143929 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.020 143929 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.020 143929 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.020 143929 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.021 143929 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.021 143929 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.021 143929 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.021 143929 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.021 143929 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.022 143929 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.022 143929 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.022 143929 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.022 143929 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.022 143929 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.022 143929 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.023 143929 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.023 143929 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.023 143929 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.023 143929 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.023 143929 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.023 143929 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.024 143929 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.024 143929 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.024 143929 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.024 143929 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.024 143929 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.024 143929 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.025 143929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.026 143929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.026 143929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.026 143929 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.026 143929 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.026 143929 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.026 143929 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.027 143929 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.028 143929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.029 143929 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.030 143929 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.031 143929 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.032 143929 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.033 143929 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.033 143929 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.033 143929 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.033 143929 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.033 143929 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.033 143929 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.034 143929 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.035 143929 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.036 143929 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.037 143929 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.038 143929 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.039 143929 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.040 143929 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.041 143929 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.042 143929 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.043 143929 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.044 143929 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.045 143929 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.045 143929 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.045 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.045 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.045 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.045 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.046 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.047 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.048 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.049 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.049 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.049 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.049 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.049 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.049 143929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.050 143929 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.050 143929 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.050 143929 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.050 143929 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 12:53:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:53:11.050 143929 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 12:53:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:12.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:12.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:13 np0005596062 python3.9[144245]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 12:53:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:14.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:53:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:14.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:53:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:15 np0005596062 python3.9[144457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:53:15 np0005596062 podman[144458]: 2026-01-26 17:53:15.578016388 +0000 UTC m=+0.146745611 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:53:15 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 12:53:15 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:53:15 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:53:15 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:53:16 np0005596062 python3.9[144609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769449994.5987215-1429-88054553141080/.source.yaml _original_basename=.r07x61to follow=False checksum=2750f4fe1239f577d6b91723132df62bfa8e4395 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:16.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:16.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:16 np0005596062 systemd[1]: session-47.scope: Deactivated successfully.
Jan 26 12:53:16 np0005596062 systemd[1]: session-47.scope: Consumed 1min 1.974s CPU time.
Jan 26 12:53:16 np0005596062 systemd-logind[781]: Session 47 logged out. Waiting for processes to exit.
Jan 26 12:53:16 np0005596062 systemd-logind[781]: Removed session 47.
Jan 26 12:53:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:18.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:18.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:20.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:20.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:21 np0005596062 systemd-logind[781]: New session 48 of user zuul.
Jan 26 12:53:21 np0005596062 systemd[1]: Started Session 48 of User zuul.
Jan 26 12:53:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:53:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:22.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:53:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:22.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:22 np0005596062 python3.9[144841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:53:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:24.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:24.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:24 np0005596062 python3.9[144998]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:26.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:26.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:53:26 np0005596062 python3.9[145164]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 12:53:26 np0005596062 systemd[1]: Reloading.
Jan 26 12:53:27 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:53:27 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:53:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:53:28 np0005596062 python3.9[145399]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:53:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:28.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:28.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:28 np0005596062 network[145417]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:53:28 np0005596062 network[145418]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:53:28 np0005596062 network[145419]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:53:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:30.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:30.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:32.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:53:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:32.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:53:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:34.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:34.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:35 np0005596062 python3.9[145684]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:36 np0005596062 python3.9[145837]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:36.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:36.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:36 np0005596062 python3.9[145991]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:37 np0005596062 python3.9[146144]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:37 np0005596062 podman[146147]: 2026-01-26 17:53:37.863628269 +0000 UTC m=+0.067477325 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 12:53:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:38.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:38.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:38 np0005596062 python3.9[146318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:39 np0005596062 python3.9[146521]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:40.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:53:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:40.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:53:41 np0005596062 python3.9[146675]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:53:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:42.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:42.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:42 np0005596062 python3.9[146829]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:43 np0005596062 python3.9[146981]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:43 np0005596062 python3.9[147133]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 12:53:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:44.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 12:53:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:44.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:44 np0005596062 python3.9[147286]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:45 np0005596062 python3.9[147438]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:45 np0005596062 podman[147562]: 2026-01-26 17:53:45.763658845 +0000 UTC m=+0.107765897 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Jan 26 12:53:45 np0005596062 python3.9[147611]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:46.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:46.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:46 np0005596062 python3.9[147768]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:47 np0005596062 python3.9[147920]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:48 np0005596062 python3.9[148072]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:48.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:48.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:48 np0005596062 python3.9[148225]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:49 np0005596062 python3.9[148377]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:49 np0005596062 python3.9[148529]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:50.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:50.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:50 np0005596062 python3.9[148682]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:51 np0005596062 python3.9[148834]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:53:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:52.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:52.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:52 np0005596062 python3.9[148987]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:53 np0005596062 python3.9[149139]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 12:53:54 np0005596062 python3.9[149291]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 12:53:54 np0005596062 systemd[1]: Reloading.
Jan 26 12:53:54 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:53:54 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:53:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:53:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:54.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:53:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:53:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:54.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:53:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:53:55 np0005596062 python3.9[149480]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:55 np0005596062 python3.9[149633]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:56.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:56.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:56 np0005596062 python3.9[149787]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:57 np0005596062 python3.9[149940]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:57 np0005596062 python3.9[150093]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:53:58.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:53:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:53:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:53:58.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:53:58 np0005596062 python3.9[150247]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:59 np0005596062 python3.9[150450]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:53:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:00.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:00.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:01 np0005596062 python3.9[150604]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 26 12:54:02 np0005596062 python3.9[150757]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 12:54:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:02.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:02.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:03 np0005596062 python3.9[150916]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 12:54:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:04.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:04.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:06 np0005596062 python3.9[151077]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:54:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:06.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:06.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:06 np0005596062 python3.9[151162]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:54:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:08.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:08.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:08 np0005596062 podman[151168]: 2026-01-26 17:54:08.886308705 +0000 UTC m=+0.082190617 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 12:54:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:54:09.140 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:54:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:54:09.141 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:54:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:54:09.141 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:54:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:10.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:10.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:12.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:12.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:14.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:14.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:16.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:16 np0005596062 podman[151198]: 2026-01-26 17:54:16.915480915 +0000 UTC m=+0.127388318 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 12:54:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:18.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:18.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:20.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:20.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:22.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:22.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:24.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:24.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:26.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:26.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:28.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:28.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:54:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:54:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:54:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:30.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:30.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:32.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:32.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:34.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:34.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:36.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:38.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:38.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:39 np0005596062 podman[151617]: 2026-01-26 17:54:39.160902256 +0000 UTC m=+0.076375977 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 12:54:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:40.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:40.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:54:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:54:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:54:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:42.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:54:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:42.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:44.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:44 np0005596062 kernel: SELinux:  Converting 2777 SID table entries...
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:54:44 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:54:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:46.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:46.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:47 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 26 12:54:47 np0005596062 podman[151725]: 2026-01-26 17:54:47.930405658 +0000 UTC m=+0.116865570 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 12:54:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:48.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:54:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:48.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:54:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:50.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:54:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:52.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:54:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:52.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:53 np0005596062 kernel: SELinux:  Converting 2777 SID table entries...
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:54:53 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:54:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:54.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:54:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:54.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:54:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:56.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:56.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:54:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:54:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:54:58.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:54:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:54:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:54:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:54:58.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:54:59 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 26 12:55:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:00.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:00.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:02.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:02.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:04.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:04.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:06.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:06.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:08.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:55:09.142 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:55:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:55:09.143 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:55:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:55:09.143 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:55:09 np0005596062 podman[153743]: 2026-01-26 17:55:09.878681149 +0000 UTC m=+0.072926234 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 12:55:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:55:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:10.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:55:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:55:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:10.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:55:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:12.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:12.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:14.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:14.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:16.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:16.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:18.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:18.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:18 np0005596062 podman[158783]: 2026-01-26 17:55:18.896598286 +0000 UTC m=+0.100483274 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 12:55:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:20.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:20.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:55:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:22.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:55:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:22.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:24.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:24.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:26.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:26.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:28.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:28.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:55:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:30.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:55:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:55:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:30.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:55:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 12:55:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:32.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 12:55:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:32.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:34.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:55:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:34.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:55:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:36.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:36.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:38.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:38.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:40.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:40 np0005596062 podman[168876]: 2026-01-26 17:55:40.340946272 +0000 UTC m=+0.061757965 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 12:55:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:40.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:55:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:55:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:55:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:42.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:42.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:55:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:44.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:55:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 12:55:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:46.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 12:55:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:55:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:46.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:55:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:48.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:48.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:49 np0005596062 kernel: SELinux:  Converting 2778 SID table entries...
Jan 26 12:55:49 np0005596062 podman[169011]: 2026-01-26 17:55:49.903768074 +0000 UTC m=+0.100756451 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability open_perms=1
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability always_check_network=0
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 12:55:49 np0005596062 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 12:55:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:50.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:51 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:55:51 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 26 12:55:51 np0005596062 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Jan 26 12:55:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:55:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:52.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:55:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:55:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:54.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:55:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:54.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:54 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:55:54 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:55:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:56.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:56.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:55:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:55:58.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:55:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:55:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:55:58.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:55:59 np0005596062 systemd[1]: Stopping OpenSSH server daemon...
Jan 26 12:55:59 np0005596062 systemd[1]: sshd.service: Deactivated successfully.
Jan 26 12:55:59 np0005596062 systemd[1]: Stopped OpenSSH server daemon.
Jan 26 12:55:59 np0005596062 systemd[1]: sshd.service: Consumed 3.277s CPU time, read 564.0K from disk, written 8.0K to disk.
Jan 26 12:55:59 np0005596062 systemd[1]: Stopped target sshd-keygen.target.
Jan 26 12:55:59 np0005596062 systemd[1]: Stopping sshd-keygen.target...
Jan 26 12:55:59 np0005596062 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 12:55:59 np0005596062 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 12:55:59 np0005596062 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 12:55:59 np0005596062 systemd[1]: Reached target sshd-keygen.target.
Jan 26 12:55:59 np0005596062 systemd[1]: Starting OpenSSH server daemon...
Jan 26 12:55:59 np0005596062 systemd[1]: Started OpenSSH server daemon.
Jan 26 12:56:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:00.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:01 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:56:01 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:56:02 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:02 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:02 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:02.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:02 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:56:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:02.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:04.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:04.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:06.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:06.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:08.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:08.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:56:09.143 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:56:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:56:09.144 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:56:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:56:09.144 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:56:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:10.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:10.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:10 np0005596062 podman[178678]: 2026-01-26 17:56:10.892444731 +0000 UTC m=+0.067916538 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 26 12:56:11 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:56:11 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:56:11 np0005596062 systemd[1]: man-db-cache-update.service: Consumed 10.816s CPU time.
Jan 26 12:56:11 np0005596062 systemd[1]: run-r1d00fc6c79994ae8893f0415e0920000.service: Deactivated successfully.
Jan 26 12:56:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:12.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:12.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:13 np0005596062 python3.9[178826]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:56:13 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:13 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:13 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:56:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:14.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:56:14 np0005596062 python3.9[179017]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:56:14 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:14.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:14 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:14 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:15 np0005596062 python3.9[179207]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:56:15 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:15 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:15 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:16.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:16.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:16 np0005596062 python3.9[179399]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:56:16 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:17 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:17 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:18.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:18.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:19 np0005596062 python3.9[179591]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:19 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:19 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:19 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:20 np0005596062 podman[179778]: 2026-01-26 17:56:20.155814407 +0000 UTC m=+0.128787388 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 12:56:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:20.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:20 np0005596062 python3.9[179847]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:56:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:20.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:56:20 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:20 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:20 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:21 np0005596062 python3.9[180045]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:21 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:21 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:21 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:22.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:22.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:22 np0005596062 python3.9[180237]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:23 np0005596062 python3.9[180392]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:23 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:23 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:23 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:24.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:24.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:25 np0005596062 python3.9[180583]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 12:56:25 np0005596062 systemd[1]: Reloading.
Jan 26 12:56:25 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:56:25 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:56:25 np0005596062 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 26 12:56:25 np0005596062 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 26 12:56:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:26.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:26.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:26 np0005596062 python3.9[180777]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:27 np0005596062 python3.9[180932]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:28.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:56:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:28.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:56:28 np0005596062 python3.9[181088]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:30.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:30 np0005596062 python3.9[181244]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:30.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:31 np0005596062 python3.9[181399]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:32 np0005596062 python3.9[181554]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:32.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:33 np0005596062 python3.9[181710]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:33 np0005596062 python3.9[181865]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:34.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:34.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:35 np0005596062 python3.9[182021]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:36.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:36.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:36 np0005596062 python3.9[182177]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:37 np0005596062 python3.9[182332]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:38.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:38.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:38 np0005596062 python3.9[182488]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:39 np0005596062 python3.9[182643]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:56:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:40.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:56:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:40.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:40 np0005596062 python3.9[182837]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 12:56:41 np0005596062 podman[182976]: 2026-01-26 17:56:41.496034192 +0000 UTC m=+0.086089718 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 12:56:41 np0005596062 python3.9[183018]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:56:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:42.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:42 np0005596062 python3.9[183174]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:56:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:42.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:43 np0005596062 python3.9[183326]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:56:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:44.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:44 np0005596062 python3.9[183479]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:56:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:44.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:44 np0005596062 python3.9[183631]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:56:46 np0005596062 python3.9[183783]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 12:56:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:46.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:46.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:46 np0005596062 python3.9[183934]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:56:47 np0005596062 python3.9[184086]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:48.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:48 np0005596062 python3.9[184212]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450207.190644-1649-70658424199375/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:56:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:48.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:56:49 np0005596062 python3.9[184364]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:49 np0005596062 python3.9[184489]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450208.6446059-1649-101329970848092/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:50.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:50 np0005596062 podman[184614]: 2026-01-26 17:56:50.359934797 +0000 UTC m=+0.095634620 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 12:56:50 np0005596062 python3.9[184662]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:50.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:51 np0005596062 python3.9[184793]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450209.9528675-1649-154754008166491/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:51 np0005596062 python3.9[184945]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:52.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:52 np0005596062 python3.9[185071]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450211.3316433-1649-217818607332050/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:52.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:53 np0005596062 python3.9[185223]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:53 np0005596062 python3.9[185348]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450212.6111639-1649-246855391176299/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:54.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:54 np0005596062 python3.9[185501]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:54.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:55 np0005596062 python3.9[185631]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450213.8521762-1649-149456599644688/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:55 np0005596062 python3.9[185895]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:56 np0005596062 python3.9[186033]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450215.222615-1649-241780971238104/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:56:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:56.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:56:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:56 np0005596062 python3.9[186185]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:56:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:56:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:56:57 np0005596062 python3.9[186310]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769450216.4195051-1649-78557033268508/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:56:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:56:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:56:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:56:58.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:56:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:56:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:56:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:56:58.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:56:58 np0005596062 python3.9[186463]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 26 12:56:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:56:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:56:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:56:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:56:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:56:59 np0005596062 python3.9[186616]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:00.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:00 np0005596062 python3.9[186817]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:01 np0005596062 python3.9[186971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:01 np0005596062 python3.9[187123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.240243) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450222240418, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 3104, "num_deletes": 503, "total_data_size": 7519216, "memory_usage": 7597712, "flush_reason": "Manual Compaction"}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450222275130, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4918671, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12425, "largest_seqno": 15524, "table_properties": {"data_size": 4906643, "index_size": 7562, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 25257, "raw_average_key_size": 18, "raw_value_size": 4880985, "raw_average_value_size": 3623, "num_data_blocks": 337, "num_entries": 1347, "num_filter_entries": 1347, "num_deletions": 503, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449922, "oldest_key_time": 1769449922, "file_creation_time": 1769450222, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 34932 microseconds, and 10909 cpu microseconds.
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.275272) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4918671 bytes OK
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.275315) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.358312) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.358353) EVENT_LOG_v1 {"time_micros": 1769450222358345, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.358373) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 7505121, prev total WAL file size 7505121, number of live WAL files 2.
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.360391) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4803KB)], [24(8606KB)]
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450222360472, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 13731246, "oldest_snapshot_seqno": -1}
Jan 26 12:57:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:02.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4336 keys, 11262860 bytes, temperature: kUnknown
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450222492548, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 11262860, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11228103, "index_size": 22807, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 106576, "raw_average_key_size": 24, "raw_value_size": 11144031, "raw_average_value_size": 2570, "num_data_blocks": 965, "num_entries": 4336, "num_filter_entries": 4336, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450222, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.492857) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 11262860 bytes
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.494579) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.9 rd, 85.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.7, 8.4 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(5.1) write-amplify(2.3) OK, records in: 5361, records dropped: 1025 output_compression: NoCompression
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.494600) EVENT_LOG_v1 {"time_micros": 1769450222494590, "job": 12, "event": "compaction_finished", "compaction_time_micros": 132189, "compaction_time_cpu_micros": 36366, "output_level": 6, "num_output_files": 1, "total_output_size": 11262860, "num_input_records": 5361, "num_output_records": 4336, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450222495818, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450222497559, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.360110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.497659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.497663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.497665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.497667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:02 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:02.497669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:02.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:02 np0005596062 python3.9[187276]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:03 np0005596062 python3.9[187428]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:04 np0005596062 python3.9[187580]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:04.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:04.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:04 np0005596062 python3.9[187733]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:05 np0005596062 python3.9[187885]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:06 np0005596062 python3.9[188038]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:06.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:06.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:06 np0005596062 python3.9[188190]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:07 np0005596062 python3.9[188342]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:08.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:08 np0005596062 python3.9[188495]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:08.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:57:09.144 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:57:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:57:09.146 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:57:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:57:09.147 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:57:09 np0005596062 python3.9[188647]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:09 np0005596062 python3.9[188799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:10.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:10.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:10 np0005596062 python3.9[188923]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450229.466167-2312-107190025647144/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:11 np0005596062 python3.9[189075]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:11 np0005596062 podman[189170]: 2026-01-26 17:57:11.727965326 +0000 UTC m=+0.084496866 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 26 12:57:11 np0005596062 python3.9[189213]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450230.7994738-2312-260747713323670/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:12.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:12.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:12 np0005596062 python3.9[189371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:13 np0005596062 python3.9[189494]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450232.1069577-2312-112680255304797/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:13 np0005596062 python3.9[189696]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:57:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:57:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:14.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:14 np0005596062 python3.9[189820]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450233.3308966-2312-92177069839582/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:14.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:15 np0005596062 python3.9[189972]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:15 np0005596062 python3.9[190095]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450234.6722736-2312-29591771865939/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:16.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:16 np0005596062 ceph-mgr[77538]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2716354406
Jan 26 12:57:16 np0005596062 python3.9[190248]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:17 np0005596062 python3.9[190371]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450236.0343165-2312-242645948098506/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:17 np0005596062 python3.9[190523]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:18 np0005596062 python3.9[190647]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450237.2818692-2312-104497772334614/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:18.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:18.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:19 np0005596062 python3.9[190799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:19 np0005596062 python3.9[190922]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450238.539523-2312-130705185701073/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:20 np0005596062 python3.9[191075]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:20.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:20.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:20 np0005596062 podman[191193]: 2026-01-26 17:57:20.695300316 +0000 UTC m=+0.136992995 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible)
Jan 26 12:57:20 np0005596062 python3.9[191262]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450239.764176-2312-116298502429175/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:21 np0005596062 python3.9[191425]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:22 np0005596062 python3.9[191548]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450241.000378-2312-97481922805361/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:22.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:22.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:22 np0005596062 python3.9[191701]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:23 np0005596062 python3.9[191824]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450242.2428517-2312-117535459106286/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:23 np0005596062 python3.9[191976]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:24 np0005596062 python3.9[192100]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450243.3818598-2312-25970410275465/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:24.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:24.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:24 np0005596062 python3.9[192252]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:25 np0005596062 python3.9[192375]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450244.5276341-2312-40218851441561/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:26.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:26 np0005596062 python3.9[192527]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:26.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:27 np0005596062 python3.9[192651]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450245.6373537-2312-151385178249927/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.423667) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450247423774, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 486, "num_deletes": 252, "total_data_size": 719307, "memory_usage": 729224, "flush_reason": "Manual Compaction"}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450247469085, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 367672, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15529, "largest_seqno": 16010, "table_properties": {"data_size": 365175, "index_size": 597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6471, "raw_average_key_size": 19, "raw_value_size": 360063, "raw_average_value_size": 1094, "num_data_blocks": 26, "num_entries": 329, "num_filter_entries": 329, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450222, "oldest_key_time": 1769450222, "file_creation_time": 1769450247, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 45481 microseconds, and 3246 cpu microseconds.
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.469145) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 367672 bytes OK
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.469168) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.474182) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.474221) EVENT_LOG_v1 {"time_micros": 1769450247474212, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.474241) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 716385, prev total WAL file size 716385, number of live WAL files 2.
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.474793) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353033' seq:0, type:0; will stop at (end)
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(359KB)], [27(10MB)]
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450247474830, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 11630532, "oldest_snapshot_seqno": -1}
Jan 26 12:57:27 np0005596062 python3.9[192801]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4156 keys, 7872607 bytes, temperature: kUnknown
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450247940462, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 7872607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7843373, "index_size": 17732, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10437, "raw_key_size": 103227, "raw_average_key_size": 24, "raw_value_size": 7766719, "raw_average_value_size": 1868, "num_data_blocks": 744, "num_entries": 4156, "num_filter_entries": 4156, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450247, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.940759) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 7872607 bytes
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.943252) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 25.0 rd, 16.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.7 +0.0 blob) out(7.5 +0.0 blob), read-write-amplify(53.0) write-amplify(21.4) OK, records in: 4665, records dropped: 509 output_compression: NoCompression
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.943295) EVENT_LOG_v1 {"time_micros": 1769450247943279, "job": 14, "event": "compaction_finished", "compaction_time_micros": 465739, "compaction_time_cpu_micros": 19227, "output_level": 6, "num_output_files": 1, "total_output_size": 7872607, "num_input_records": 4665, "num_output_records": 4156, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450247943574, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450247946921, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.474633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.947025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.947032) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.947034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.947036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-17:57:27.947038) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 12:57:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:28.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:28.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:28 np0005596062 python3.9[192957]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 26 12:57:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:30.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:30.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:31 np0005596062 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 26 12:57:32 np0005596062 python3.9[193114]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:32.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:32.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:32 np0005596062 python3.9[193267]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:33 np0005596062 auditd[701]: Audit daemon rotating log files
Jan 26 12:57:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:33 np0005596062 python3.9[193419]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:33 np0005596062 python3.9[193571]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:34.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:34.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:34 np0005596062 python3.9[193724]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:35 np0005596062 python3.9[193876]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:36.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:36.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:36 np0005596062 python3.9[194029]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:37 np0005596062 python3.9[194181]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:37 np0005596062 python3.9[194333]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:38.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:38.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:38 np0005596062 python3.9[194486]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:39 np0005596062 python3.9[194638]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:57:39 np0005596062 systemd[1]: Reloading.
Jan 26 12:57:39 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:57:39 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:57:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:40.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:40.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:40 np0005596062 systemd[1]: Starting libvirt logging daemon socket...
Jan 26 12:57:40 np0005596062 systemd[1]: Listening on libvirt logging daemon socket.
Jan 26 12:57:40 np0005596062 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 26 12:57:40 np0005596062 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 26 12:57:40 np0005596062 systemd[1]: Starting libvirt logging daemon...
Jan 26 12:57:41 np0005596062 systemd[1]: Started libvirt logging daemon.
Jan 26 12:57:41 np0005596062 podman[194883]: 2026-01-26 17:57:41.861872796 +0000 UTC m=+0.062066513 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Jan 26 12:57:42 np0005596062 python3.9[194882]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:57:42 np0005596062 systemd[1]: Reloading.
Jan 26 12:57:42 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:57:42 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:57:42 np0005596062 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 26 12:57:42 np0005596062 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 26 12:57:42 np0005596062 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 26 12:57:42 np0005596062 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 26 12:57:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:42.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:42 np0005596062 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 26 12:57:42 np0005596062 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 26 12:57:42 np0005596062 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 26 12:57:42 np0005596062 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 12:57:42 np0005596062 systemd[1]: Started libvirt nodedev daemon.
Jan 26 12:57:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:42.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:42 np0005596062 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 26 12:57:42 np0005596062 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 26 12:57:42 np0005596062 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 26 12:57:43 np0005596062 python3.9[195125]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:57:43 np0005596062 systemd[1]: Reloading.
Jan 26 12:57:43 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:57:43 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:57:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:43 np0005596062 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 26 12:57:43 np0005596062 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 26 12:57:43 np0005596062 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 26 12:57:43 np0005596062 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 26 12:57:43 np0005596062 systemd[1]: Starting libvirt proxy daemon...
Jan 26 12:57:43 np0005596062 systemd[1]: Started libvirt proxy daemon.
Jan 26 12:57:43 np0005596062 setroubleshoot[194937]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 5da12e20-67a2-4c5a-9071-534ae6b1d536
Jan 26 12:57:43 np0005596062 setroubleshoot[194937]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 26 12:57:43 np0005596062 setroubleshoot[194937]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 5da12e20-67a2-4c5a-9071-534ae6b1d536
Jan 26 12:57:43 np0005596062 setroubleshoot[194937]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 26 12:57:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:44.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:44.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:44 np0005596062 python3.9[195340]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:57:44 np0005596062 systemd[1]: Reloading.
Jan 26 12:57:44 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:57:44 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:57:45 np0005596062 systemd[1]: Listening on libvirt locking daemon socket.
Jan 26 12:57:45 np0005596062 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 26 12:57:45 np0005596062 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 26 12:57:45 np0005596062 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 26 12:57:45 np0005596062 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 26 12:57:45 np0005596062 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 26 12:57:45 np0005596062 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 26 12:57:45 np0005596062 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 26 12:57:45 np0005596062 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 26 12:57:45 np0005596062 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 26 12:57:45 np0005596062 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 12:57:45 np0005596062 systemd[1]: Started libvirt QEMU daemon.
Jan 26 12:57:46 np0005596062 python3.9[195555]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:57:46 np0005596062 systemd[1]: Reloading.
Jan 26 12:57:46 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:57:46 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:57:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:46.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:46 np0005596062 systemd[1]: Starting libvirt secret daemon socket...
Jan 26 12:57:46 np0005596062 systemd[1]: Listening on libvirt secret daemon socket.
Jan 26 12:57:46 np0005596062 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 26 12:57:46 np0005596062 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 26 12:57:46 np0005596062 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 26 12:57:46 np0005596062 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 26 12:57:46 np0005596062 systemd[1]: Starting libvirt secret daemon...
Jan 26 12:57:46 np0005596062 systemd[1]: Started libvirt secret daemon.
Jan 26 12:57:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:46.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:47 np0005596062 python3.9[195768]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:48 np0005596062 python3.9[195921]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 12:57:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:48.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:48.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:49 np0005596062 python3.9[196073]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:57:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:50.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:50.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:50 np0005596062 python3.9[196228]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 12:57:50 np0005596062 podman[196253]: 2026-01-26 17:57:50.96279688 +0000 UTC m=+0.160492456 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 12:57:51 np0005596062 python3.9[196404]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:52 np0005596062 python3.9[196526]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450271.0641952-3386-197356599453974/.source.xml follow=False _original_basename=secret.xml.j2 checksum=f5640975c7830314b4ada1f1cfe8314b62b47503 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:57:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:52.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:57:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:52.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:52 np0005596062 python3.9[196678]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine d4cd1917-5876-51b6-bc64-65a16199754d#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:57:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:53 np0005596062 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 26 12:57:53 np0005596062 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.012s CPU time.
Jan 26 12:57:54 np0005596062 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 26 12:57:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:54.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:54.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:55 np0005596062 python3.9[196841]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:56.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:57:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:56.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:57:58 np0005596062 python3.9[197305]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:57:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.003000079s ======
Jan 26 12:57:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:57:58.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Jan 26 12:57:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:57:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:57:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:57:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:57:58.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:57:58 np0005596062 python3.9[197458]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:57:59 np0005596062 python3.9[197581]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450278.326117-3551-31771008289935/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:00.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:00.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:00 np0005596062 python3.9[197734]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:01 np0005596062 python3.9[197936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:02 np0005596062 python3.9[198015]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:02.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:03 np0005596062 python3.9[198167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:03 np0005596062 python3.9[198245]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5nsbibsf recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:04 np0005596062 python3.9[198398]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:04.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:04 np0005596062 python3.9[198476]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:05 np0005596062 python3.9[198628]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:58:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:06.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:06 np0005596062 python3[198782]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 12:58:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:06.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:07 np0005596062 python3.9[198934]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:07 np0005596062 python3.9[199012]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:08.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:08.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:08 np0005596062 python3.9[199165]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:58:09.147 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:58:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:58:09.148 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:58:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:58:09.149 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:58:09 np0005596062 python3.9[199290]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450288.208342-3818-40610898222002/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:10 np0005596062 python3.9[199442]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:10.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:10.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:10 np0005596062 python3.9[199521]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:11 np0005596062 python3.9[199673]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:11 np0005596062 podman[199751]: 2026-01-26 17:58:11.987336064 +0000 UTC m=+0.061725514 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 12:58:12 np0005596062 python3.9[199752]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:12.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:12.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:12 np0005596062 python3.9[199922]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:13 np0005596062 python3.9[200047]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769450292.3868344-3935-35695455584332/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:14 np0005596062 python3.9[200317]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:14.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:14.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:58:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:58:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:58:15 np0005596062 python3.9[200483]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:58:16 np0005596062 python3.9[200638]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:16.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:16.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:16 np0005596062 python3.9[200791]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:58:17 np0005596062 python3.9[200944]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:58:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:18.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:18 np0005596062 python3.9[201099]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:58:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:18.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:19 np0005596062 python3.9[201254]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:20 np0005596062 python3.9[201406]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:20.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:20 np0005596062 python3.9[201530]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450299.6138494-4151-50753228751905/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:20.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:21 np0005596062 podman[201654]: 2026-01-26 17:58:21.343367065 +0000 UTC m=+0.100986522 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 12:58:21 np0005596062 python3.9[201733]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:22 np0005596062 python3.9[201882]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450300.9415674-4196-43506266975324/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:22.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:22.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:22 np0005596062 python3.9[202035]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:23 np0005596062 python3.9[202158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450302.2690008-4241-257397017973393/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:24 np0005596062 python3.9[202310]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:58:24 np0005596062 systemd[1]: Reloading.
Jan 26 12:58:24 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:58:24 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:58:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:24.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:24 np0005596062 systemd[1]: Reached target edpm_libvirt.target.
Jan 26 12:58:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:24.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:25 np0005596062 python3.9[202551]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 12:58:25 np0005596062 systemd[1]: Reloading.
Jan 26 12:58:25 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:58:25 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:58:25 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:58:25 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:58:25 np0005596062 systemd[1]: Reloading.
Jan 26 12:58:26 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:58:26 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:58:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:26.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:26.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:26 np0005596062 systemd[1]: session-48.scope: Deactivated successfully.
Jan 26 12:58:26 np0005596062 systemd[1]: session-48.scope: Consumed 3min 45.667s CPU time.
Jan 26 12:58:26 np0005596062 systemd-logind[781]: Session 48 logged out. Waiting for processes to exit.
Jan 26 12:58:26 np0005596062 systemd-logind[781]: Removed session 48.
Jan 26 12:58:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:28.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:28.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:30.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:30.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:31 np0005596062 systemd-logind[781]: New session 49 of user zuul.
Jan 26 12:58:31 np0005596062 systemd[1]: Started Session 49 of User zuul.
Jan 26 12:58:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:32.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:32.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:32 np0005596062 python3.9[202806]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:58:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:34 np0005596062 python3.9[202961]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:58:34 np0005596062 network[202978]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:58:34 np0005596062 network[202979]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:58:34 np0005596062 network[202980]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:58:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:34.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:36.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:36.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:38.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:38.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:39 np0005596062 python3.9[203255]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 12:58:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:40.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:40.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:40 np0005596062 python3.9[203340]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:58:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:42.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:42.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:42 np0005596062 podman[203393]: 2026-01-26 17:58:42.876680787 +0000 UTC m=+0.080618433 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 12:58:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:44.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:44.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:46.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:46.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:47 np0005596062 python3.9[203565]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:58:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:48.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:48.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:48 np0005596062 python3.9[203718]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:58:50 np0005596062 python3.9[203872]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:58:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:50.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:50.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:50 np0005596062 python3.9[204024]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:58:51 np0005596062 podman[204149]: 2026-01-26 17:58:51.682803765 +0000 UTC m=+0.149823384 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 26 12:58:51 np0005596062 python3.9[204190]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:58:52 np0005596062 python3.9[204325]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450331.2014332-248-80374510184746/.source.iscsi _original_basename=.phfjp38t follow=False checksum=4e799d15ff5da171478b906bce802462b403660f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:52.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:52.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:53 np0005596062 python3.9[204477]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:58:54 np0005596062 python3.9[204629]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:58:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:54.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:54.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:55 np0005596062 python3.9[204782]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:58:55 np0005596062 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 26 12:58:56 np0005596062 python3.9[204939]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:58:56 np0005596062 systemd[1]: Reloading.
Jan 26 12:58:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:56.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:56 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:58:56 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:58:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:58:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:56.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:58:56 np0005596062 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 12:58:56 np0005596062 systemd[1]: Starting Open-iSCSI...
Jan 26 12:58:56 np0005596062 kernel: Loading iSCSI transport class v2.0-870.
Jan 26 12:58:56 np0005596062 systemd[1]: Started Open-iSCSI.
Jan 26 12:58:56 np0005596062 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 26 12:58:56 np0005596062 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 26 12:58:57 np0005596062 python3.9[205138]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:58:57 np0005596062 network[205155]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:58:58 np0005596062 network[205157]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:58:58 np0005596062 network[205158]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:58:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:58:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:58:58.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:58:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:58:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:58:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:58:59.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:58:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:00.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:01.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:02.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:03.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:04 np0005596062 python3.9[205483]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:59:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:04.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:05.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:06.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:06 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:59:06 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:59:06 np0005596062 systemd[1]: Reloading.
Jan 26 12:59:06 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:59:06 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:59:07 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:59:07 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:59:07 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:59:07 np0005596062 systemd[1]: run-r270a92544f3741b0941f20cea084cbe9.service: Deactivated successfully.
Jan 26 12:59:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:07.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:08.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:08 np0005596062 python3.9[205801]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 12:59:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:59:09.147 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 12:59:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:59:09.149 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 12:59:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 17:59:09.149 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 12:59:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:09.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:09 np0005596062 python3.9[205953]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 26 12:59:10 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 12:59:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:10.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:10 np0005596062 python3.9[206111]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:59:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:11.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:11 np0005596062 python3.9[206234]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450350.2073402-512-23005580056649/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:12 np0005596062 python3.9[206387]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:12.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:13 np0005596062 podman[206511]: 2026-01-26 17:59:13.238357045 +0000 UTC m=+0.078394465 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 12:59:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:13 np0005596062 python3.9[206556]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:59:13 np0005596062 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 12:59:13 np0005596062 systemd[1]: Stopped Load Kernel Modules.
Jan 26 12:59:13 np0005596062 systemd[1]: Stopping Load Kernel Modules...
Jan 26 12:59:13 np0005596062 systemd[1]: Starting Load Kernel Modules...
Jan 26 12:59:13 np0005596062 systemd[1]: Finished Load Kernel Modules.
Jan 26 12:59:14 np0005596062 python3.9[206715]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:59:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:14.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:15 np0005596062 python3.9[206868]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:59:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:15 np0005596062 python3.9[207020]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:59:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:16.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:16 np0005596062 python3.9[207144]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450355.4496624-665-278343934841281/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:17 np0005596062 python3.9[207296]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:59:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:18 np0005596062 python3.9[207449]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:59:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:59:19 np0005596062 python3.9[207602]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:59:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:59:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:19 np0005596062 python3.9[207754]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:20 np0005596062 python3.9[207907]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:20.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:21 np0005596062 python3.9[208059]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:21 np0005596062 python3.9[208211]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:21 np0005596062 podman[208235]: 2026-01-26 17:59:21.8875697 +0000 UTC m=+0.100152069 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 12:59:22 np0005596062 python3.9[208440]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:59:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:22.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:59:23 np0005596062 python3.9[208592]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 12:59:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:23.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:23 np0005596062 python3.9[208746]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 12:59:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:24.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:24 np0005596062 python3.9[208900]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:24 np0005596062 systemd[1]: Listening on multipathd control socket.
Jan 26 12:59:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:25.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:25 np0005596062 python3.9[209156]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:26.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:26 np0005596062 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 26 12:59:26 np0005596062 udevadm[209194]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 26 12:59:26 np0005596062 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 26 12:59:26 np0005596062 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 12:59:26 np0005596062 multipathd[209197]: --------start up--------
Jan 26 12:59:26 np0005596062 multipathd[209197]: read /etc/multipath.conf
Jan 26 12:59:26 np0005596062 multipathd[209197]: path checkers start up
Jan 26 12:59:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 12:59:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:59:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 12:59:26 np0005596062 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 12:59:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:27.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:27 np0005596062 python3.9[209356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 12:59:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:28.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:28 np0005596062 python3.9[209509]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 26 12:59:28 np0005596062 kernel: Key type psk registered
Jan 26 12:59:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:29.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:29 np0005596062 python3.9[209672]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 12:59:30 np0005596062 python3.9[209795]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769450369.1063435-1055-59852565017549/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:30.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:30 np0005596062 python3.9[209948]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:31.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:31 np0005596062 python3.9[210100]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:59:31 np0005596062 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 12:59:31 np0005596062 systemd[1]: Stopped Load Kernel Modules.
Jan 26 12:59:31 np0005596062 systemd[1]: Stopping Load Kernel Modules...
Jan 26 12:59:31 np0005596062 systemd[1]: Starting Load Kernel Modules...
Jan 26 12:59:31 np0005596062 systemd[1]: Finished Load Kernel Modules.
Jan 26 12:59:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:32.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:32 np0005596062 python3.9[210257]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 12:59:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:59:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 12:59:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:33.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:34.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:35.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:35 np0005596062 systemd[1]: Reloading.
Jan 26 12:59:35 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:59:35 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:59:36 np0005596062 systemd[1]: Reloading.
Jan 26 12:59:36 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:59:36 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:59:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:59:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:36.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:59:36 np0005596062 systemd-logind[781]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 12:59:36 np0005596062 systemd-logind[781]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 12:59:36 np0005596062 lvm[210423]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 12:59:36 np0005596062 lvm[210423]: VG ceph_vg0 finished
Jan 26 12:59:36 np0005596062 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 12:59:36 np0005596062 systemd[1]: Starting man-db-cache-update.service...
Jan 26 12:59:37 np0005596062 systemd[1]: Reloading.
Jan 26 12:59:37 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:59:37 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:59:37 np0005596062 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 12:59:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:37.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:38.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:38 np0005596062 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 12:59:38 np0005596062 systemd[1]: Finished man-db-cache-update.service.
Jan 26 12:59:38 np0005596062 systemd[1]: man-db-cache-update.service: Consumed 1.710s CPU time.
Jan 26 12:59:38 np0005596062 systemd[1]: run-r600a0995107844bebbd8425495754675.service: Deactivated successfully.
Jan 26 12:59:38 np0005596062 python3.9[211779]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:59:38 np0005596062 systemd[1]: Stopping Open-iSCSI...
Jan 26 12:59:38 np0005596062 iscsid[204979]: iscsid shutting down.
Jan 26 12:59:38 np0005596062 systemd[1]: iscsid.service: Deactivated successfully.
Jan 26 12:59:38 np0005596062 systemd[1]: Stopped Open-iSCSI.
Jan 26 12:59:38 np0005596062 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 12:59:38 np0005596062 systemd[1]: Starting Open-iSCSI...
Jan 26 12:59:38 np0005596062 systemd[1]: Started Open-iSCSI.
Jan 26 12:59:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:39.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:39 np0005596062 python3.9[211935]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 12:59:39 np0005596062 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 26 12:59:39 np0005596062 multipathd[209197]: exit (signal)
Jan 26 12:59:39 np0005596062 multipathd[209197]: --------shut down-------
Jan 26 12:59:39 np0005596062 systemd[1]: multipathd.service: Deactivated successfully.
Jan 26 12:59:39 np0005596062 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 26 12:59:40 np0005596062 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 12:59:40 np0005596062 multipathd[211942]: --------start up--------
Jan 26 12:59:40 np0005596062 multipathd[211942]: read /etc/multipath.conf
Jan 26 12:59:40 np0005596062 multipathd[211942]: path checkers start up
Jan 26 12:59:40 np0005596062 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 12:59:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 12:59:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:40.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 12:59:41 np0005596062 python3.9[212099]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 12:59:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:41.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:42 np0005596062 python3.9[212284]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 12:59:42 np0005596062 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 26 12:59:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:42.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:43 np0005596062 python3.9[212459]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 12:59:43 np0005596062 systemd[1]: Reloading.
Jan 26 12:59:43 np0005596062 podman[212461]: 2026-01-26 17:59:43.396595849 +0000 UTC m=+0.083233342 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 12:59:43 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 12:59:43 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 12:59:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:43.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:43 np0005596062 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 12:59:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:44.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:44 np0005596062 python3.9[212663]: ansible-ansible.builtin.service_facts Invoked
Jan 26 12:59:44 np0005596062 network[212680]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 12:59:44 np0005596062 network[212681]: 'network-scripts' will be removed from distribution in near future.
Jan 26 12:59:44 np0005596062 network[212682]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 12:59:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:45.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:46.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:47.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:48.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:49.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:50.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:51.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:52 np0005596062 podman[212931]: 2026-01-26 17:59:52.212637597 +0000 UTC m=+0.121112664 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 12:59:52 np0005596062 python3.9[212971]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:52.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:53.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:54 np0005596062 python3.9[213138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:54.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:55 np0005596062 python3.9[213291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:55.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:55 np0005596062 python3.9[213444]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:56.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:56 np0005596062 python3.9[213598]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:57 np0005596062 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 12:59:57 np0005596062 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 26 12:59:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:57.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 12:59:57 np0005596062 python3.9[213751]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:58 np0005596062 python3.9[213907]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 12:59:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 12:59:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:17:59:58.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 12:59:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 12:59:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 12:59:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 12:59:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:17:59:59.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:00.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:01.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:02 np0005596062 python3.9[214060]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 13:00:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:02.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:03 np0005596062 python3.9[214265]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:03.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:03 np0005596062 python3.9[214417]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:03 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:00:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:04 np0005596062 python3.9[214570]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:04.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:05 np0005596062 python3.9[214722]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:05.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:05 np0005596062 python3.9[214874]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:06 np0005596062 python3.9[215027]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:06.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:06 np0005596062 python3.9[215179]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:07.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:07 np0005596062 python3.9[215331]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:08.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:08 np0005596062 python3.9[215484]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:00:09.148 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:00:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:00:09.149 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:00:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:00:09.149 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:00:09 np0005596062 python3.9[215636]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:09.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:10 np0005596062 python3.9[215788]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:10.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:10 np0005596062 python3.9[215941]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:11 np0005596062 python3.9[216093]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:11.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:12 np0005596062 python3.9[216245]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:12.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:12 np0005596062 python3.9[216398]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:13 np0005596062 python3.9[216550]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:13 np0005596062 podman[216575]: 2026-01-26 18:00:13.860151868 +0000 UTC m=+0.065091443 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 13:00:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:14 np0005596062 python3.9[216723]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:14.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.033740) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450415033810, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1625, "num_deletes": 251, "total_data_size": 3930456, "memory_usage": 3979936, "flush_reason": "Manual Compaction"}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450415051150, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2582644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16015, "largest_seqno": 17635, "table_properties": {"data_size": 2575905, "index_size": 3874, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13691, "raw_average_key_size": 19, "raw_value_size": 2562520, "raw_average_value_size": 3681, "num_data_blocks": 175, "num_entries": 696, "num_filter_entries": 696, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450248, "oldest_key_time": 1769450248, "file_creation_time": 1769450415, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 17455 microseconds, and 7454 cpu microseconds.
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.051193) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2582644 bytes OK
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.051214) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.052583) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.052595) EVENT_LOG_v1 {"time_micros": 1769450415052592, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.052615) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 3923174, prev total WAL file size 3923174, number of live WAL files 2.
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.053774) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2522KB)], [30(7688KB)]
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450415053867, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 10455251, "oldest_snapshot_seqno": -1}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4337 keys, 8395438 bytes, temperature: kUnknown
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450415118744, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 8395438, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8364652, "index_size": 18836, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 107456, "raw_average_key_size": 24, "raw_value_size": 8284388, "raw_average_value_size": 1910, "num_data_blocks": 789, "num_entries": 4337, "num_filter_entries": 4337, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450415, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.118994) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 8395438 bytes
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.120251) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.0 rd, 129.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 7.5 +0.0 blob) out(8.0 +0.0 blob), read-write-amplify(7.3) write-amplify(3.3) OK, records in: 4852, records dropped: 515 output_compression: NoCompression
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.120272) EVENT_LOG_v1 {"time_micros": 1769450415120261, "job": 16, "event": "compaction_finished", "compaction_time_micros": 64947, "compaction_time_cpu_micros": 24906, "output_level": 6, "num_output_files": 1, "total_output_size": 8395438, "num_input_records": 4852, "num_output_records": 4337, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450415120848, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450415122328, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.053543) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.122461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.122470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.122472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.122474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:15 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:15.122475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:15 np0005596062 python3.9[216875]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 13:00:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:15.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:16 np0005596062 python3.9[217028]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 13:00:16 np0005596062 systemd[1]: Reloading.
Jan 26 13:00:16 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 13:00:16 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 13:00:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:16.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:17 np0005596062 python3.9[217216]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:17.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:18 np0005596062 python3.9[217370]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:18.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:18 np0005596062 python3.9[217523]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:19 np0005596062 python3.9[217676]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:19.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:20 np0005596062 python3.9[217830]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:20.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:20 np0005596062 python3.9[217983]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:21 np0005596062 python3.9[218136]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:21.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:22 np0005596062 podman[218233]: 2026-01-26 18:00:22.510519295 +0000 UTC m=+0.178106142 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 13:00:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:22.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.615259) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450422615648, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 310, "num_deletes": 255, "total_data_size": 164179, "memory_usage": 170552, "flush_reason": "Manual Compaction"}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450422620614, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 108428, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17640, "largest_seqno": 17945, "table_properties": {"data_size": 106464, "index_size": 192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4526, "raw_average_key_size": 16, "raw_value_size": 102610, "raw_average_value_size": 367, "num_data_blocks": 9, "num_entries": 279, "num_filter_entries": 279, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450416, "oldest_key_time": 1769450416, "file_creation_time": 1769450422, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 5893 microseconds, and 1757 cpu microseconds.
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.621149) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 108428 bytes OK
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.621334) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.624102) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.624131) EVENT_LOG_v1 {"time_micros": 1769450422624121, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.624154) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 161934, prev total WAL file size 161934, number of live WAL files 2.
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.626346) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(105KB)], [33(8198KB)]
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450422626431, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 8503866, "oldest_snapshot_seqno": -1}
Jan 26 13:00:22 np0005596062 python3.9[218365]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4098 keys, 8149364 bytes, temperature: kUnknown
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450422792308, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 8149364, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8120358, "index_size": 17632, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10309, "raw_key_size": 103783, "raw_average_key_size": 25, "raw_value_size": 8044304, "raw_average_value_size": 1962, "num_data_blocks": 725, "num_entries": 4098, "num_filter_entries": 4098, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450422, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.792623) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 8149364 bytes
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.794592) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 51.2 rd, 49.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 8.0 +0.0 blob) out(7.8 +0.0 blob), read-write-amplify(153.6) write-amplify(75.2) OK, records in: 4616, records dropped: 518 output_compression: NoCompression
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.794625) EVENT_LOG_v1 {"time_micros": 1769450422794610, "job": 18, "event": "compaction_finished", "compaction_time_micros": 165968, "compaction_time_cpu_micros": 27105, "output_level": 6, "num_output_files": 1, "total_output_size": 8149364, "num_input_records": 4616, "num_output_records": 4098, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450422794837, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450422797335, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.626143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.797429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.797436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.797439) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.797442) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:22 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:00:22.797445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:00:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:23.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:24.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:25.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:25 np0005596062 python3.9[218519]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:26 np0005596062 python3.9[218672]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:26.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:27 np0005596062 python3.9[218824]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:27.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:27 np0005596062 python3.9[218976]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:28 np0005596062 python3.9[219129]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:28.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:29 np0005596062 python3.9[219281]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:29.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:29 np0005596062 python3.9[219433]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:30 np0005596062 python3.9[219586]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:30.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:31 np0005596062 python3.9[219738]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:31.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:31 np0005596062 python3.9[219890]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:32.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:33.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:34 np0005596062 podman[220088]: 2026-01-26 18:00:34.106276181 +0000 UTC m=+0.064381554 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 13:00:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:34.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:34 np0005596062 podman[220108]: 2026-01-26 18:00:34.819310372 +0000 UTC m=+0.597245919 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 26 13:00:35 np0005596062 podman[220088]: 2026-01-26 18:00:35.011239999 +0000 UTC m=+0.969345462 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 26 13:00:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:35.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:36.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:36 np0005596062 podman[220240]: 2026-01-26 18:00:36.666234246 +0000 UTC m=+0.816441467 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:00:36 np0005596062 podman[220240]: 2026-01-26 18:00:36.675106721 +0000 UTC m=+0.825313912 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:00:37 np0005596062 python3.9[220449]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 26 13:00:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:37.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:37 np0005596062 podman[220383]: 2026-01-26 18:00:37.72416705 +0000 UTC m=+0.889102019 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.28.2, release=1793, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, version=2.2.4, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9)
Jan 26 13:00:38 np0005596062 podman[220383]: 2026-01-26 18:00:38.047477292 +0000 UTC m=+1.212412261 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, architecture=x86_64, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, version=2.2.4)
Jan 26 13:00:38 np0005596062 python3.9[220621]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 13:00:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:38.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:39.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:40 np0005596062 python3.9[220904]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 13:00:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:40.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:00:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:41.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:00:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:00:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:00:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:00:42 np0005596062 systemd-logind[781]: New session 50 of user zuul.
Jan 26 13:00:42 np0005596062 systemd[1]: Started Session 50 of User zuul.
Jan 26 13:00:42 np0005596062 systemd[1]: session-50.scope: Deactivated successfully.
Jan 26 13:00:42 np0005596062 systemd-logind[781]: Session 50 logged out. Waiting for processes to exit.
Jan 26 13:00:42 np0005596062 systemd-logind[781]: Removed session 50.
Jan 26 13:00:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:42.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:43 np0005596062 python3.9[221149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:43.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:43 np0005596062 python3.9[221270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450442.5328016-2662-47334423453849/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:44 np0005596062 podman[221394]: 2026-01-26 18:00:44.065631872 +0000 UTC m=+0.078466116 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 13:00:44 np0005596062 python3.9[221430]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:44.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:44 np0005596062 python3.9[221519]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:45 np0005596062 python3.9[221669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:45.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:45 np0005596062 python3.9[221790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450444.8361082-2662-109475402330018/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:46.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:46 np0005596062 python3.9[221941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:47 np0005596062 python3.9[222062]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450446.1070879-2662-257642625041521/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:47.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:47 np0005596062 python3.9[222212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:48 np0005596062 python3.9[222334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450447.3387744-2662-201986113660751/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:48.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:49 np0005596062 python3.9[222484]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:49.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:49 np0005596062 python3.9[222605]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450448.5451758-2662-228630443599845/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:50.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:51 np0005596062 python3.9[222758]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:51.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:51 np0005596062 python3.9[222910]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:00:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:52.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:52 np0005596062 python3.9[223063]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:00:52 np0005596062 podman[223088]: 2026-01-26 18:00:52.903051416 +0000 UTC m=+0.111871260 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 26 13:00:53 np0005596062 python3.9[223242]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:53.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:53 np0005596062 python3.9[223365]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769450452.8966792-2984-266047365287729/.source _original_basename=.nd3mfp5s follow=False checksum=14e1e8f3dabbb49004d41b25d6552e8c389feb0d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 26 13:00:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:54.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:54 np0005596062 python3.9[223518]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:00:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:00:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:55.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:00:55 np0005596062 python3.9[223670]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:56 np0005596062 python3.9[223792]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450455.2110517-3061-80650674073993/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:56.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:00:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:00:57 np0005596062 python3.9[223992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 13:00:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:00:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:57.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:00:57 np0005596062 python3.9[224113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769450456.4966514-3106-144699566406110/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 13:00:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:00:58.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:58 np0005596062 python3.9[224266]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 26 13:00:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:00:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:00:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:00:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:00:59.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:00:59 np0005596062 python3.9[224418]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 13:01:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:00.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:01.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:02 np0005596062 python3[224583]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 13:01:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:02.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:03.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:04.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:05.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:06.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:07.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:08.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:01:09.148 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:01:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:01:09.150 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:01:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:01:09.150 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:01:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:09.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:01:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:10.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:01:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:11.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:01:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:12.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:01:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:14.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:14 np0005596062 podman[224595]: 2026-01-26 18:01:14.19409444 +0000 UTC m=+11.662766768 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 26 13:01:14 np0005596062 podman[224747]: 2026-01-26 18:01:14.375142865 +0000 UTC m=+0.063295284 container create 76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 13:01:14 np0005596062 podman[224747]: 2026-01-26 18:01:14.342561789 +0000 UTC m=+0.030714228 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 26 13:01:14 np0005596062 python3[224583]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 26 13:01:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:01:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:14.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:01:14 np0005596062 podman[224810]: 2026-01-26 18:01:14.876547934 +0000 UTC m=+0.082397935 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:01:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:15 np0005596062 ceph-mds[83671]: mds.beacon.cephfs.compute-2.oqvedy missed beacon ack from the monitors
Jan 26 13:01:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:16.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:16.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:18.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:18.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:19 np0005596062 ceph-mds[83671]: mds.beacon.cephfs.compute-2.oqvedy missed beacon ack from the monitors
Jan 26 13:01:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).paxos(paxos updating c 1256..1812) lease_timeout -- calling new election
Jan 26 13:01:20 np0005596062 ceph-mon[77178]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 26 13:01:20 np0005596062 ceph-mon[77178]: paxos.1).electionLogic(14) init, last seen epoch 14
Jan 26 13:01:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:01:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:20.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:01:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:22.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:01:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:22.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:23 np0005596062 podman[224882]: 2026-01-26 18:01:23.055648096 +0000 UTC m=+0.098580810 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:01:23 np0005596062 ceph-mds[83671]: mds.beacon.cephfs.compute-2.oqvedy missed beacon ack from the monitors
Jan 26 13:01:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:24.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:24.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [INF] : mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [DBG] : monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(leader) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-2.oqvedy=up:active} 2 up:standby
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [DBG] : osdmap e137: 3 total, 3 up, 3 in
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [DBG] : mgrmap e11: compute-0.mbryrf(active, since 21m), standbys: compute-2.cchxrf, compute-1.qpyzhk
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum compute-2,compute-1 (MON_DOWN)
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(leader).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum compute-2,compute-1
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum compute-2,compute-1
Jan 26 13:01:25 np0005596062 ceph-mon[77178]: log_channel(cluster) log [WRN] :     mon.compute-0 (rank 0) addr [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] is down (out of quorum)
Jan 26 13:01:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:26.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: mon.compute-1 calling monitor election
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: mon.compute-2 calling monitor election
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: Health check failed: 1/3 mons down, quorum compute-2,compute-1 (MON_DOWN)
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: Health detail: HEALTH_WARN 1/3 mons down, quorum compute-2,compute-1
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: [WRN] MON_DOWN: 1/3 mons down, quorum compute-2,compute-1
Jan 26 13:01:26 np0005596062 ceph-mon[77178]:    mon.compute-0 (rank 0) addr [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] is down (out of quorum)
Jan 26 13:01:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:01:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:26.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:01:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:01:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:01:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:28.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:28 np0005596062 ceph-mon[77178]: mon.compute-0 calling monitor election
Jan 26 13:01:28 np0005596062 ceph-mon[77178]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 26 13:01:28 np0005596062 ceph-mon[77178]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum compute-2,compute-1)
Jan 26 13:01:28 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 13:01:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:28.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:29 np0005596062 python3.9[225040]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:01:29 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:01:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:30.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:30 np0005596062 python3.9[225195]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 26 13:01:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:30.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:31 np0005596062 python3.9[225347]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 13:01:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:32.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:32 np0005596062 python3[225500]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 13:01:32 np0005596062 podman[225536]: 2026-01-26 18:01:32.628420812 +0000 UTC m=+0.058848507 container create dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, container_name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 13:01:32 np0005596062 podman[225536]: 2026-01-26 18:01:32.595470987 +0000 UTC m=+0.025898712 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 26 13:01:32 np0005596062 python3[225500]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 26 13:01:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:01:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:32.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:01:33 np0005596062 python3.9[225726]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:01:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:34.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:34 np0005596062 python3.9[225881]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:01:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:34.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:35 np0005596062 python3.9[226032]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769450494.6309688-3393-220685709565681/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 13:01:35 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:36.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:36 np0005596062 python3.9[226108]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 13:01:36 np0005596062 systemd[1]: Reloading.
Jan 26 13:01:36 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 13:01:36 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 13:01:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:36.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:37 np0005596062 python3.9[226220]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 13:01:37 np0005596062 systemd[1]: Reloading.
Jan 26 13:01:37 np0005596062 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 13:01:37 np0005596062 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 13:01:37 np0005596062 systemd[1]: Starting nova_compute container...
Jan 26 13:01:37 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:01:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:37 np0005596062 podman[226260]: 2026-01-26 18:01:37.988065525 +0000 UTC m=+0.392135649 container init dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 13:01:37 np0005596062 podman[226260]: 2026-01-26 18:01:37.999158697 +0000 UTC m=+0.403228761 container start dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + sudo -E kolla_set_configs
Jan 26 13:01:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:38.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Validating config file
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying service configuration files
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Deleting /etc/ceph
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Creating directory /etc/ceph
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Writing out command to execute
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:38 np0005596062 nova_compute[226276]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 13:01:38 np0005596062 nova_compute[226276]: ++ cat /run_command
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + CMD=nova-compute
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + ARGS=
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + sudo kolla_copy_cacerts
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + [[ ! -n '' ]]
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + . kolla_extend_start
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 13:01:38 np0005596062 nova_compute[226276]: Running command: 'nova-compute'
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + umask 0022
Jan 26 13:01:38 np0005596062 nova_compute[226276]: + exec nova-compute
Jan 26 13:01:38 np0005596062 podman[226260]: nova_compute
Jan 26 13:01:38 np0005596062 systemd[1]: Started nova_compute container.
Jan 26 13:01:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:38.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:01:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:40.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:01:40 np0005596062 python3.9[226440]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:01:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.633 226281 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.634 226281 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.634 226281 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.634 226281 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 26 13:01:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:01:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:40.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.873 226281 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.902 226281 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:01:40 np0005596062 nova_compute[226276]: 2026-01-26 18:01:40.903 226281 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 26 13:01:41 np0005596062 python3.9[226594]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.420 226281 INFO nova.virt.driver [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.563 226281 INFO nova.compute.provider_config [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.587 226281 DEBUG oslo_concurrency.lockutils [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.588 226281 DEBUG oslo_concurrency.lockutils [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.588 226281 DEBUG oslo_concurrency.lockutils [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.589 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.589 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.589 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.589 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.589 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.589 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.590 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.590 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.590 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.590 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.590 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.590 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.591 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.592 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.592 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.592 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.592 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.592 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.592 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.593 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.593 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.593 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.593 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.593 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.594 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.595 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.595 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.595 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.595 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.595 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.596 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.596 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.596 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.596 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.596 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.596 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.597 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.597 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.597 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.597 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.597 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.598 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.598 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.598 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.598 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.598 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.599 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.599 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.599 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.599 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.599 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.600 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.600 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.600 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.600 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.600 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.601 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.601 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.601 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.601 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.601 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.602 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.602 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.602 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.602 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.602 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.603 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.603 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.603 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.603 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.603 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.604 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.604 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.604 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.604 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.604 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.605 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.605 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.605 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.605 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.605 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.606 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.606 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.606 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.606 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.606 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.606 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.607 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.608 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.608 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.608 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.608 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.608 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.609 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.609 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.609 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.609 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.609 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.609 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.610 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.610 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.610 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.610 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.610 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.610 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.611 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.611 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.611 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.611 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.612 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.612 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.612 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.612 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.612 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.612 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.613 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.614 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.614 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.614 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.614 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.614 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.614 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.615 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.616 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.616 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.616 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.616 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.616 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.616 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.617 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.618 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.619 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.619 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.619 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.619 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.619 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.619 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.620 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.621 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.622 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.622 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.622 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.622 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.622 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.622 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.623 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.624 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.625 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.626 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.627 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.628 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.629 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.630 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.631 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.632 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.633 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.634 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.635 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.636 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.637 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.638 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.639 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.640 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.641 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.642 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.643 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.644 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.644 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.644 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.644 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.644 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.644 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.645 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.645 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.645 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.645 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.645 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.646 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.647 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.648 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.649 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.649 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.649 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.649 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.649 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.650 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.651 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.651 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.651 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.651 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.651 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.651 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.652 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.653 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.653 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.653 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.653 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.653 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.653 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.654 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.654 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.654 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.654 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.655 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.655 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.655 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.655 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.655 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.655 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.656 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.656 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.656 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.656 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.657 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.657 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.657 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.657 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.657 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.658 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.658 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.658 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.658 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.658 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.659 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.659 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.659 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.659 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.659 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.660 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.660 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.660 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.660 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.660 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.661 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.661 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.661 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.661 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.661 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.662 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.662 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.662 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.662 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.662 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.663 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.663 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.663 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.663 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.663 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.664 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.664 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.664 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.664 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.664 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.665 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.665 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.665 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.665 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.665 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.666 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.666 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.666 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.666 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.666 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.666 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.667 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.667 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.667 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.667 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.667 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.668 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.668 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.668 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.668 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.668 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.668 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.669 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.669 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.669 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.669 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.669 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.670 226281 WARNING oslo_config.cfg [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 13:01:41 np0005596062 nova_compute[226276]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 13:01:41 np0005596062 nova_compute[226276]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 13:01:41 np0005596062 nova_compute[226276]: and ``live_migration_inbound_addr`` respectively.
Jan 26 13:01:41 np0005596062 nova_compute[226276]: ).  Its value may be silently ignored in the future.#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.670 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.670 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.670 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.671 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.671 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.671 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.671 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.672 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.672 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.672 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.672 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.672 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.673 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.673 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.673 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.673 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.673 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.674 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.674 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rbd_secret_uuid        = d4cd1917-5876-51b6-bc64-65a16199754d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.674 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.674 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.674 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.675 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.675 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.675 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.675 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.675 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.675 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.676 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.676 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.676 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.676 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.677 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.677 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.677 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.677 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.677 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.678 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.678 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.678 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.678 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.678 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.679 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.679 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.679 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.679 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.679 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.679 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.680 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.680 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.680 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.680 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.681 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.681 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.681 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.681 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.681 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.681 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.682 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.682 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.682 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.682 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.682 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.683 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.683 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.683 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.683 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.683 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.684 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.684 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.684 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.684 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.684 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.685 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.685 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.685 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.685 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.685 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.686 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.686 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.686 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.686 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.686 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.687 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.687 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.687 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.687 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.687 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.687 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.688 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.688 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.688 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.688 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.688 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.688 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.689 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.690 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.690 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.690 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.690 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.690 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.690 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.691 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.691 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.691 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.691 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.691 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.691 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.692 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.693 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.693 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.693 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.693 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.693 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.693 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.694 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.694 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.694 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.694 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.694 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.695 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.696 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.696 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.696 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.696 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.696 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.697 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.698 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.698 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.698 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.698 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.698 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.698 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.699 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.699 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.699 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.699 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.699 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.699 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.700 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.701 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.701 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.701 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.701 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.701 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.701 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.702 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.702 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.702 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.702 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.702 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.703 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.703 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.703 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.703 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.703 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.703 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.704 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.705 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.705 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.705 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.705 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.705 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.705 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.706 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.707 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.707 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.707 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.707 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.707 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.707 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.708 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.708 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.708 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.708 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.708 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.709 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.709 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.709 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.709 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.709 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.710 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.710 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.710 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.710 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.710 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.710 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.711 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.712 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.712 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.712 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.712 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.712 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.712 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.713 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.713 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.713 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.713 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.713 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.713 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.714 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.714 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.714 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.714 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.714 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.714 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.715 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.716 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.716 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.716 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.716 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.716 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.716 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.717 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.718 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.719 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.719 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.719 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.719 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.719 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.719 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.720 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.720 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.720 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.720 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.720 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.720 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.721 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.722 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.723 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.724 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.725 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.726 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.727 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.728 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.729 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.730 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.731 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.732 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.733 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.734 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.735 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.735 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.735 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.735 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.735 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.735 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.736 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.736 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.736 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.736 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.736 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.736 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.737 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.737 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.737 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.737 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.737 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.737 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.738 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.739 226281 DEBUG oslo_service.service [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.741 226281 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.756 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.757 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.757 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.757 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 26 13:01:41 np0005596062 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 13:01:41 np0005596062 systemd[1]: Started libvirt QEMU daemon.
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.860 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fee8b063e50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.863 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fee8b063e50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.865 226281 INFO nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.878 226281 WARNING nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 26 13:01:41 np0005596062 nova_compute[226276]: 2026-01-26 18:01:41.879 226281 DEBUG nova.virt.libvirt.volume.mount [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 26 13:01:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:01:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:42.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:01:42 np0005596062 python3.9[226788]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 13:01:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:42.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.757 226281 INFO nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <host>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <uuid>5c33c4b0-14ac-46af-8c94-d3bb1b6300af</uuid>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <arch>x86_64</arch>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model>EPYC-Rome-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <vendor>AMD</vendor>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <microcode version='16777317'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <signature family='23' model='49' stepping='0'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='x2apic'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='tsc-deadline'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='osxsave'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='hypervisor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='tsc_adjust'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='spec-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='stibp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='arch-capabilities'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='cmp_legacy'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='topoext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='virt-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='lbrv'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='tsc-scale'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='vmcb-clean'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='pause-filter'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='pfthreshold'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='svme-addr-chk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='rdctl-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='skip-l1dfl-vmentry'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='mds-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature name='pschange-mc-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <pages unit='KiB' size='4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <pages unit='KiB' size='2048'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <pages unit='KiB' size='1048576'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <power_management>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <suspend_mem/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </power_management>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <iommu support='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <migration_features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <live/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <uri_transports>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <uri_transport>tcp</uri_transport>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <uri_transport>rdma</uri_transport>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </uri_transports>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </migration_features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <topology>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <cells num='1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <cell id='0'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          <memory unit='KiB'>7864308</memory>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          <pages unit='KiB' size='4'>1966077</pages>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          <pages unit='KiB' size='2048'>0</pages>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          <distances>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <sibling id='0' value='10'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          </distances>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          <cpus num='8'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:          </cpus>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        </cell>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </cells>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </topology>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <cache>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </cache>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <secmodel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model>selinux</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <doi>0</doi>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </secmodel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <secmodel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model>dac</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <doi>0</doi>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </secmodel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </host>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <guest>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <os_type>hvm</os_type>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <arch name='i686'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <wordsize>32</wordsize>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <domain type='qemu'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <domain type='kvm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </arch>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <pae/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <nonpae/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <acpi default='on' toggle='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <apic default='on' toggle='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <cpuselection/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <deviceboot/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <disksnapshot default='on' toggle='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <externalSnapshot/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </guest>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <guest>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <os_type>hvm</os_type>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <arch name='x86_64'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <wordsize>64</wordsize>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <domain type='qemu'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <domain type='kvm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </arch>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <acpi default='on' toggle='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <apic default='on' toggle='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <cpuselection/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <deviceboot/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <disksnapshot default='on' toggle='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <externalSnapshot/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </guest>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 
Jan 26 13:01:42 np0005596062 nova_compute[226276]: </capabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: #033[00m
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.769 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.800 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 13:01:42 np0005596062 nova_compute[226276]: <domainCapabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <domain>kvm</domain>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <arch>i686</arch>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <vcpu max='240'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <iothreads supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <os supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <enum name='firmware'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <loader supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>rom</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pflash</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='readonly'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>yes</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='secure'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </loader>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </os>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='maximumMigratable'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <vendor>AMD</vendor>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='succor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='custom' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v6'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v7'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='athlon'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='athlon-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='core2duo'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='core2duo-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='coreduo'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='coreduo-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='n270'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='n270-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='phenom'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='phenom-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <memoryBacking supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <enum name='sourceType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>file</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>anonymous</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>memfd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </memoryBacking>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <devices>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <disk supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='diskDevice'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>disk</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>cdrom</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>floppy</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>lun</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>ide</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>fdc</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>sata</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </disk>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <graphics supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vnc</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>egl-headless</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </graphics>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <video supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='modelType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vga</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>cirrus</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>none</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>bochs</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>ramfb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </video>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <hostdev supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='mode'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>subsystem</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='startupPolicy'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>mandatory</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>requisite</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>optional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='subsysType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pci</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='capsType'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='pciBackend'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </hostdev>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <rng supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>random</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>egd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </rng>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <filesystem supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='driverType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>path</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>handle</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtiofs</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </filesystem>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <tpm supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tpm-tis</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tpm-crb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>emulator</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>external</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendVersion'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>2.0</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </tpm>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <redirdev supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </redirdev>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <channel supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </channel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <crypto supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>qemu</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </crypto>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <interface supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>passt</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </interface>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <panic supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>isa</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>hyperv</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </panic>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <console supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>null</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vc</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>dev</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>file</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pipe</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>stdio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>udp</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tcp</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>qemu-vdagent</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </console>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </devices>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <gic supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <vmcoreinfo supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <genid supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <backingStoreInput supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <backup supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <async-teardown supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <s390-pv supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <ps2 supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <tdx supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <sev supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <sgx supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <hyperv supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='features'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>relaxed</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vapic</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>spinlocks</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vpindex</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>runtime</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>synic</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>stimer</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>reset</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vendor_id</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>frequencies</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>reenlightenment</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tlbflush</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>ipi</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>avic</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>emsr_bitmap</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>xmm_input</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <defaults>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <spinlocks>4095</spinlocks>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <stimer_direct>on</stimer_direct>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </defaults>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </hyperv>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <launchSecurity supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: </domainCapabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.811 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 13:01:42 np0005596062 nova_compute[226276]: <domainCapabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <domain>kvm</domain>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <arch>i686</arch>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <vcpu max='4096'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <iothreads supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <os supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <enum name='firmware'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <loader supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>rom</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pflash</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='readonly'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>yes</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='secure'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </loader>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </os>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='maximumMigratable'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <vendor>AMD</vendor>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='succor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='custom' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v6'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v7'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v5'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='athlon'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='athlon-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='core2duo'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='core2duo-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='coreduo'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='coreduo-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='n270'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='n270-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='phenom'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='phenom-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <memoryBacking supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <enum name='sourceType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>file</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>anonymous</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>memfd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </memoryBacking>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <devices>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <disk supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='diskDevice'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>disk</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>cdrom</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>floppy</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>lun</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>fdc</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>sata</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </disk>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <graphics supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vnc</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>egl-headless</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </graphics>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <video supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='modelType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vga</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>cirrus</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>none</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>bochs</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>ramfb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </video>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <hostdev supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='mode'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>subsystem</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='startupPolicy'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>mandatory</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>requisite</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>optional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='subsysType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pci</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='capsType'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='pciBackend'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </hostdev>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <rng supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>random</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>egd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </rng>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <filesystem supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='driverType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>path</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>handle</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>virtiofs</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </filesystem>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <tpm supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tpm-tis</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tpm-crb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>emulator</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>external</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendVersion'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>2.0</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </tpm>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <redirdev supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </redirdev>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <channel supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </channel>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <crypto supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>qemu</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </crypto>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <interface supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='backendType'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>passt</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </interface>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <panic supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>isa</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>hyperv</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </panic>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <console supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>null</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vc</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>dev</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>file</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pipe</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>stdio</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>udp</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tcp</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>qemu-vdagent</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </console>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </devices>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <gic supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <vmcoreinfo supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <genid supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <backingStoreInput supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <backup supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <async-teardown supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <s390-pv supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <ps2 supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <tdx supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <sev supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <sgx supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <hyperv supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='features'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>relaxed</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vapic</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>spinlocks</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vpindex</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>runtime</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>synic</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>stimer</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>reset</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>vendor_id</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>frequencies</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>reenlightenment</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>tlbflush</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>ipi</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>avic</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>emsr_bitmap</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>xmm_input</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <defaults>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <spinlocks>4095</spinlocks>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <stimer_direct>on</stimer_direct>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </defaults>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </hyperv>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <launchSecurity supported='no'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </features>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: </domainCapabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.879 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 26 13:01:42 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.886 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 13:01:42 np0005596062 nova_compute[226276]: <domainCapabilities>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <domain>kvm</domain>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <arch>x86_64</arch>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <vcpu max='240'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <iothreads supported='yes'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <os supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <enum name='firmware'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <loader supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>rom</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>pflash</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='readonly'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>yes</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='secure'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </loader>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  </os>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:  <cpu>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <enum name='maximumMigratable'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <vendor>AMD</vendor>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='succor'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:    <mode name='custom' supported='yes'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:42 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v6'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v7'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='athlon'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='athlon-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='core2duo'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='core2duo-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='coreduo'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='coreduo-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='n270'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='n270-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='phenom'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='phenom-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </cpu>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <memoryBacking supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <enum name='sourceType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>file</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>anonymous</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>memfd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </memoryBacking>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <devices>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <disk supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='diskDevice'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>disk</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>cdrom</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>floppy</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>lun</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>ide</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>fdc</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>sata</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </disk>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <graphics supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vnc</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>egl-headless</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </graphics>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <video supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='modelType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vga</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>cirrus</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>none</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>bochs</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>ramfb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </video>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <hostdev supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='mode'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>subsystem</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='startupPolicy'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>mandatory</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>requisite</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>optional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='subsysType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pci</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='capsType'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='pciBackend'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </hostdev>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <rng supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>random</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>egd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </rng>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <filesystem supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='driverType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>path</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>handle</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtiofs</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </filesystem>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <tpm supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tpm-tis</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tpm-crb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>emulator</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>external</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendVersion'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>2.0</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </tpm>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <redirdev supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </redirdev>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <channel supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </channel>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <crypto supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>qemu</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </crypto>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <interface supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>passt</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </interface>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <panic supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>isa</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>hyperv</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </panic>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <console supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>null</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vc</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>dev</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>file</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pipe</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>stdio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>udp</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tcp</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>qemu-vdagent</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </console>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </devices>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <features>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <gic supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <vmcoreinfo supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <genid supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <backingStoreInput supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <backup supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <async-teardown supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <s390-pv supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <ps2 supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <tdx supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <sev supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <sgx supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <hyperv supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='features'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>relaxed</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vapic</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>spinlocks</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vpindex</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>runtime</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>synic</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>stimer</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>reset</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vendor_id</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>frequencies</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>reenlightenment</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tlbflush</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>ipi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>avic</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>emsr_bitmap</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>xmm_input</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <defaults>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <spinlocks>4095</spinlocks>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <stimer_direct>on</stimer_direct>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </defaults>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </hyperv>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <launchSecurity supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </features>
Jan 26 13:01:43 np0005596062 nova_compute[226276]: </domainCapabilities>
Jan 26 13:01:43 np0005596062 nova_compute[226276]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:42.980 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 13:01:43 np0005596062 nova_compute[226276]: <domainCapabilities>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <domain>kvm</domain>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <arch>x86_64</arch>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <vcpu max='4096'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <iothreads supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <os supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <enum name='firmware'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>efi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <loader supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>rom</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pflash</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='readonly'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>yes</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='secure'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>yes</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>no</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </loader>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </os>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <cpu>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='maximumMigratable'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>on</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>off</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <vendor>AMD</vendor>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='succor'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <mode name='custom' supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ddpd-u'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sha512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm3'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sm4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Denverton-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Milan-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Rome-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-Turin-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amd-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='auto-ibrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='perfmon-v2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbpb'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='stibp-always-on'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='EPYC-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='GraniteRapids-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-128'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-256'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx10-512'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='prefetchiti'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Haswell-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v6'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Icelake-Server-v7'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='IvyBridge-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='KnightsMill-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512er'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512pf'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G4-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Opteron_G5-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fma4'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tbm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xop'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SapphireRapids-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='amx-tile'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-bf16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-fp16'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bitalg'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrc'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fzrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='la57'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='taa-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='SierraForest-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ifma'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cmpccxadd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fbsdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='fsrs'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ibrs-all'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='intel-psfd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='lam'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mcdt-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pbrsb-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='psdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='serialize'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vaes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Client-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='hle'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='rtm'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Skylake-Server-v5'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512bw'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512cd'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512dq'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512f'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='avx512vl'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='invpcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pcid'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='pku'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='mpx'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v2'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v3'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='core-capability'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='split-lock-detect'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='Snowridge-v4'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='cldemote'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='erms'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='gfni'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdir64b'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='movdiri'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='xsaves'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='athlon'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='athlon-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='core2duo'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='core2duo-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='coreduo'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='coreduo-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='n270'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='n270-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='ss'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='phenom'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <blockers model='phenom-v1'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnow'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <feature name='3dnowext'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </blockers>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </mode>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </cpu>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <memoryBacking supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <enum name='sourceType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>file</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>anonymous</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <value>memfd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </memoryBacking>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <devices>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <disk supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='diskDevice'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>disk</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>cdrom</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>floppy</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>lun</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>fdc</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>sata</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </disk>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <graphics supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vnc</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>egl-headless</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </graphics>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <video supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='modelType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vga</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>cirrus</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>none</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>bochs</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>ramfb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </video>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <hostdev supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='mode'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>subsystem</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='startupPolicy'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>mandatory</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>requisite</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>optional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='subsysType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pci</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>scsi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='capsType'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='pciBackend'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </hostdev>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <rng supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtio-non-transitional</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>random</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>egd</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </rng>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <filesystem supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='driverType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>path</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>handle</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>virtiofs</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </filesystem>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <tpm supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tpm-tis</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tpm-crb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>emulator</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>external</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendVersion'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>2.0</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </tpm>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <redirdev supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='bus'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>usb</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </redirdev>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <channel supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </channel>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <crypto supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>qemu</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendModel'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>builtin</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </crypto>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <interface supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='backendType'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>default</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>passt</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </interface>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <panic supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='model'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>isa</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>hyperv</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </panic>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <console supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='type'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>null</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vc</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pty</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>dev</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>file</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>pipe</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>stdio</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>udp</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tcp</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>unix</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>qemu-vdagent</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>dbus</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </console>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </devices>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <features>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <gic supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <vmcoreinfo supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <genid supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <backingStoreInput supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <backup supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <async-teardown supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <s390-pv supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <ps2 supported='yes'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <tdx supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <sev supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <sgx supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <hyperv supported='yes'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <enum name='features'>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>relaxed</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vapic</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>spinlocks</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vpindex</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>runtime</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>synic</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>stimer</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>reset</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>vendor_id</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>frequencies</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>reenlightenment</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>tlbflush</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>ipi</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>avic</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>emsr_bitmap</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <value>xmm_input</value>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </enum>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      <defaults>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <spinlocks>4095</spinlocks>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <stimer_direct>on</stimer_direct>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:      </defaults>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    </hyperv>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:    <launchSecurity supported='no'/>
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  </features>
Jan 26 13:01:43 np0005596062 nova_compute[226276]: </domainCapabilities>
Jan 26 13:01:43 np0005596062 nova_compute[226276]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.069 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.069 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.070 226281 DEBUG nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.077 226281 INFO nova.virt.libvirt.host [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Secure Boot support detected#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.081 226281 INFO nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.081 226281 INFO nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.094 226281 DEBUG nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 26 13:01:43 np0005596062 nova_compute[226276]:  <model>Nehalem</model>
Jan 26 13:01:43 np0005596062 nova_compute[226276]: </cpu>
Jan 26 13:01:43 np0005596062 nova_compute[226276]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.097 226281 DEBUG nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.135 226281 INFO nova.virt.node [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Determined node identity 65600a65-69bc-488c-8c8c-71cbf43e523a from /var/lib/nova/compute_id#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.162 226281 WARNING nova.compute.manager [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Compute nodes ['65600a65-69bc-488c-8c8c-71cbf43e523a'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.194 226281 INFO nova.compute.manager [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.235 226281 WARNING nova.compute.manager [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.236 226281 DEBUG oslo_concurrency.lockutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.236 226281 DEBUG oslo_concurrency.lockutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.237 226281 DEBUG oslo_concurrency.lockutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.237 226281 DEBUG nova.compute.resource_tracker [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.238 226281 DEBUG oslo_concurrency.processutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:01:43 np0005596062 python3.9[226961]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 13:01:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:01:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3454 writes, 18K keys, 3454 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3454 writes, 3454 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1304 writes, 6509 keys, 1304 commit groups, 1.0 writes per commit group, ingest: 13.69 MB, 0.02 MB/s#012Interval WAL: 1304 writes, 1304 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     55.6      0.40              0.08         9    0.045       0      0       0.0       0.0#012  L6      1/0    7.77 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0     54.8     45.1      1.48              0.23         8    0.186     36K   4367       0.0       0.0#012 Sum      1/0    7.77 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     43.2     47.3      1.89              0.31        17    0.111     36K   4367       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.0     40.6     40.8      1.30              0.19        10    0.130     24K   3086       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     54.8     45.1      1.48              0.23         8    0.186     36K   4367       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     55.8      0.40              0.08         8    0.050       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.022, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.09 GB write, 0.07 MB/s write, 0.08 GB read, 0.07 MB/s read, 1.9 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 1.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 4.71 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 9.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(256,4.40 MB,1.44627%) FilterBlock(17,109.48 KB,0.0351705%) IndexBlock(17,214.67 KB,0.0689607%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 13:01:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:01:43 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2609637507' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:01:43 np0005596062 nova_compute[226276]: 2026-01-26 18:01:43.671 226281 DEBUG oslo_concurrency.processutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:01:43 np0005596062 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 13:01:43 np0005596062 systemd[1]: Started libvirt nodedev daemon.
Jan 26 13:01:43 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 13:01:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:01:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:44.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.076 226281 WARNING nova.virt.libvirt.driver [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.079 226281 DEBUG nova.compute.resource_tracker [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5251MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.080 226281 DEBUG oslo_concurrency.lockutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.080 226281 DEBUG oslo_concurrency.lockutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.105 226281 WARNING nova.compute.resource_tracker [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] No compute node record for compute-2.ctlplane.example.com:65600a65-69bc-488c-8c8c-71cbf43e523a: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 65600a65-69bc-488c-8c8c-71cbf43e523a could not be found.#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.134 226281 INFO nova.compute.resource_tracker [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 65600a65-69bc-488c-8c8c-71cbf43e523a#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.188 226281 DEBUG nova.compute.resource_tracker [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.189 226281 DEBUG nova.compute.resource_tracker [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:01:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:44.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.827 226281 INFO nova.scheduler.client.report [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] [req-6ee16716-2fa0-4b68-a08f-2ac7735dd883] Created resource provider record via placement API for resource provider with UUID 65600a65-69bc-488c-8c8c-71cbf43e523a and name compute-2.ctlplane.example.com.#033[00m
Jan 26 13:01:44 np0005596062 python3.9[227229]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.861 226281 DEBUG oslo_concurrency.processutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:01:44 np0005596062 systemd[1]: Stopping nova_compute container...
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.954 226281 DEBUG oslo_concurrency.lockutils [None req-9bec2bd6-f210-4852-8c9d-7e1c36fd2645 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.955 226281 DEBUG oslo_concurrency.lockutils [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.955 226281 DEBUG oslo_concurrency.lockutils [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:01:44 np0005596062 nova_compute[226276]: 2026-01-26 18:01:44.955 226281 DEBUG oslo_concurrency.lockutils [None req-a2f4e471-2382-4312-883d-0fe47e18f295 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:01:44 np0005596062 podman[227235]: 2026-01-26 18:01:44.975555643 +0000 UTC m=+0.064137405 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:01:45 np0005596062 virtqemud[226715]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 13:01:45 np0005596062 virtqemud[226715]: hostname: compute-2
Jan 26 13:01:45 np0005596062 virtqemud[226715]: End of file while reading data: Input/output error
Jan 26 13:01:45 np0005596062 systemd[1]: libpod-dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e.scope: Deactivated successfully.
Jan 26 13:01:45 np0005596062 systemd[1]: libpod-dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e.scope: Consumed 4.360s CPU time.
Jan 26 13:01:45 np0005596062 podman[227234]: 2026-01-26 18:01:45.372404435 +0000 UTC m=+0.462601619 container died dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Jan 26 13:01:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e-userdata-shm.mount: Deactivated successfully.
Jan 26 13:01:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay-b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644-merged.mount: Deactivated successfully.
Jan 26 13:01:45 np0005596062 podman[227234]: 2026-01-26 18:01:45.458449405 +0000 UTC m=+0.548646559 container cleanup dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 26 13:01:45 np0005596062 podman[227234]: nova_compute
Jan 26 13:01:45 np0005596062 podman[227283]: nova_compute
Jan 26 13:01:45 np0005596062 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 26 13:01:45 np0005596062 systemd[1]: Stopped nova_compute container.
Jan 26 13:01:45 np0005596062 systemd[1]: Starting nova_compute container...
Jan 26 13:01:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:01:45 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:01:45 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:45 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:45 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:45 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:45 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92f847bda04240a69dd3df027fe32cdc3559355e777795c44185bd8123a9644/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:45 np0005596062 podman[227296]: 2026-01-26 18:01:45.650449218 +0000 UTC m=+0.089166763 container init dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 13:01:45 np0005596062 podman[227296]: 2026-01-26 18:01:45.661059476 +0000 UTC m=+0.099777001 container start dbeddebf709836b8cf20c25f15874acea5c6bee40063df894b5147fe9c876e6e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 13:01:45 np0005596062 podman[227296]: nova_compute
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + sudo -E kolla_set_configs
Jan 26 13:01:45 np0005596062 systemd[1]: Started nova_compute container.
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Validating config file
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying service configuration files
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /etc/ceph
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Creating directory /etc/ceph
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Writing out command to execute
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:45 np0005596062 nova_compute[227313]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 13:01:45 np0005596062 nova_compute[227313]: ++ cat /run_command
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + CMD=nova-compute
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + ARGS=
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + sudo kolla_copy_cacerts
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + [[ ! -n '' ]]
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + . kolla_extend_start
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 13:01:45 np0005596062 nova_compute[227313]: Running command: 'nova-compute'
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + umask 0022
Jan 26 13:01:45 np0005596062 nova_compute[227313]: + exec nova-compute
Jan 26 13:01:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:46.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:01:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:46.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:01:47 np0005596062 python3.9[227478]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 13:01:47 np0005596062 systemd[1]: Started libpod-conmon-76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e.scope.
Jan 26 13:01:47 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:01:47 np0005596062 nova_compute[227313]: 2026-01-26 18:01:47.865 227317 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 13:01:47 np0005596062 nova_compute[227313]: 2026-01-26 18:01:47.867 227317 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 13:01:47 np0005596062 nova_compute[227313]: 2026-01-26 18:01:47.867 227317 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 26 13:01:47 np0005596062 nova_compute[227313]: 2026-01-26 18:01:47.867 227317 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 13:01:47 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/620dfa25d4c7df5c7fbdb25c10baab93c843f0af65c89a8199c41de132c0e715/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:47 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/620dfa25d4c7df5c7fbdb25c10baab93c843f0af65c89a8199c41de132c0e715/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:47 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/620dfa25d4c7df5c7fbdb25c10baab93c843f0af65c89a8199c41de132c0e715/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 26 13:01:47 np0005596062 podman[227503]: 2026-01-26 18:01:47.894317601 +0000 UTC m=+0.146893129 container init 76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 13:01:47 np0005596062 podman[227503]: 2026-01-26 18:01:47.90341905 +0000 UTC m=+0.155994578 container start 76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 26 13:01:47 np0005596062 python3.9[227478]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Applying nova statedir ownership
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 26 13:01:47 np0005596062 nova_compute_init[227527]: INFO:nova_statedir:Nova statedir ownership complete
Jan 26 13:01:47 np0005596062 systemd[1]: libpod-76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e.scope: Deactivated successfully.
Jan 26 13:01:47 np0005596062 podman[227541]: 2026-01-26 18:01:47.995781066 +0000 UTC m=+0.026047785 container died 76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.016 227317 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:01:48 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e-userdata-shm.mount: Deactivated successfully.
Jan 26 13:01:48 np0005596062 systemd[1]: var-lib-containers-storage-overlay-620dfa25d4c7df5c7fbdb25c10baab93c843f0af65c89a8199c41de132c0e715-merged.mount: Deactivated successfully.
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.040 227317 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.041 227317 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 26 13:01:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:01:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:01:48.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:01:48 np0005596062 podman[227541]: 2026-01-26 18:01:48.220991971 +0000 UTC m=+0.251258690 container cleanup 76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:01:48 np0005596062 systemd[1]: libpod-conmon-76ed68bd163228ff492216c059402bcfa8679b5e851c70de4e36a581320ca11e.scope: Deactivated successfully.
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.517 227317 INFO nova.virt.driver [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.626 227317 INFO nova.compute.provider_config [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 13:01:48 np0005596062 systemd-logind[781]: Session 49 logged out. Waiting for processes to exit.
Jan 26 13:01:48 np0005596062 systemd[1]: session-49.scope: Deactivated successfully.
Jan 26 13:01:48 np0005596062 systemd[1]: session-49.scope: Consumed 2min 9.486s CPU time.
Jan 26 13:01:48 np0005596062 systemd-logind[781]: Removed session 49.
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.648 227317 DEBUG oslo_concurrency.lockutils [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.648 227317 DEBUG oslo_concurrency.lockutils [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.648 227317 DEBUG oslo_concurrency.lockutils [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.649 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.649 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.649 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.649 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.650 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.650 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.650 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.650 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.650 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.650 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.651 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.651 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.651 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.651 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.651 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.652 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.652 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.652 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.652 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.652 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.652 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.653 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.653 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.653 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.653 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.654 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.654 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.654 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.654 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.655 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.656 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.656 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.656 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.656 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.656 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.656 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.657 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.658 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.659 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.660 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.661 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.662 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.663 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.664 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.665 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.666 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.666 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.666 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.666 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.666 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.667 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.667 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:01:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:01:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:01:48.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.667 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.667 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.667 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.668 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.668 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.668 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.668 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.668 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.668 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.669 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.669 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.669 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.669 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.669 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.669 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.670 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.671 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.672 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.673 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.673 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.673 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.673 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.673 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.674 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.674 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.674 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.674 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.674 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.675 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.676 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.676 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.676 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.676 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.676 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.676 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.677 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.677 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.677 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.677 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.677 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.677 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.678 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.678 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.678 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.678 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.678 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.678 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.679 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.679 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.679 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.679 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.679 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.679 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.680 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.681 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.682 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.683 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.684 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.684 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.684 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.684 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.684 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.684 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.685 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.686 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.686 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.686 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.686 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.686 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.686 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.687 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.687 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.687 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.687 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.687 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.687 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.688 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.688 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.688 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.688 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.688 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.688 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.689 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.689 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.689 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.689 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.689 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.689 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.690 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.690 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.690 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.690 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.690 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.691 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.691 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.691 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.691 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.691 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.692 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.692 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.692 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.692 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.692 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.693 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.693 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.693 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.693 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.693 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.694 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.694 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.694 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.694 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.694 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.694 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.695 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.696 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.696 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.696 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.696 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.696 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.697 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.698 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.698 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.698 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.698 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.698 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.698 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.699 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.699 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.699 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.699 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.699 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.699 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.700 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.700 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.700 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.700 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.700 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.700 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.701 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.701 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.701 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.701 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.701 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.701 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.702 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.703 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.703 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.703 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.703 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.703 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.703 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.704 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.705 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.705 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.705 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.705 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.705 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.705 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.706 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.706 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.706 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.706 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.706 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.707 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.708 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.709 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.710 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.711 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.712 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.713 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.714 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.715 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.716 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.716 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.716 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.716 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.716 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.716 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.717 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.718 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.719 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.720 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.721 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.721 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.721 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.721 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.721 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.721 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.722 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.723 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.723 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.723 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.723 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.723 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.723 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.724 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.725 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.726 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.726 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.726 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.726 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.726 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.726 227317 WARNING oslo_config.cfg [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 13:01:48 np0005596062 nova_compute[227313]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 13:01:48 np0005596062 nova_compute[227313]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 13:01:48 np0005596062 nova_compute[227313]: and ``live_migration_inbound_addr`` respectively.
Jan 26 13:01:48 np0005596062 nova_compute[227313]: ).  Its value may be silently ignored in the future.#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.727 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.727 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.727 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.727 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.727 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.727 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.728 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rbd_secret_uuid        = d4cd1917-5876-51b6-bc64-65a16199754d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.729 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.730 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.731 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.731 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.731 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.731 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.731 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.731 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.732 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.733 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.733 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.733 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.733 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.733 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.733 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.734 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.735 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.736 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.737 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.737 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.737 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.737 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.737 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.738 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.739 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.739 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.739 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.739 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.739 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.739 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.740 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.740 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.740 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.740 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.740 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.740 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.741 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.742 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.743 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.743 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.743 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.743 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.743 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.743 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.744 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.745 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.745 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.745 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.745 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.745 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.746 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.747 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.748 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.748 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.748 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.748 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.748 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.749 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.750 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.751 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.752 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.753 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.753 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.753 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.753 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.753 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.753 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.754 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.755 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.756 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.756 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.756 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.756 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.756 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.757 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.757 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.757 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.757 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.758 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.758 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.758 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.758 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.758 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.759 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.759 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.759 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.759 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.759 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.759 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.760 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.760 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.760 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.760 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.760 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.761 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.761 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.761 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.761 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.761 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.762 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.763 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.763 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.763 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.763 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.763 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.763 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.764 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.765 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.765 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.765 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.765 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.765 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.765 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.766 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.766 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.766 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.766 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.766 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.767 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.768 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.768 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.768 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.768 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.768 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.768 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.769 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.769 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.769 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.769 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.769 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.770 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.770 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.770 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.770 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.770 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.770 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.771 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.771 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.771 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.771 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.771 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.772 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.772 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.772 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.772 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.772 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.772 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.773 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.774 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.774 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.774 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.774 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.774 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.775 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.776 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.777 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.777 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.777 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.777 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.777 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.777 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.778 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.779 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.780 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.781 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.782 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.783 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.783 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.783 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.783 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.783 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.783 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.784 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.785 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.786 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.787 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.788 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.789 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.789 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.789 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.789 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.789 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.789 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.790 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.791 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.792 227317 DEBUG oslo_service.service [None req-fc80f5d2-a589-4a92-b79b-0cd402ba9479 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.793 227317 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.818 227317 INFO nova.virt.node [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Determined node identity 65600a65-69bc-488c-8c8c-71cbf43e523a from /var/lib/nova/compute_id
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.819 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.819 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.820 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.820 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.831 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd730427a60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.834 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd730427a60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.834 227317 INFO nova.virt.libvirt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Connection event '1' reason 'None'
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.841 227317 INFO nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <host>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <uuid>5c33c4b0-14ac-46af-8c94-d3bb1b6300af</uuid>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <cpu>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <arch>x86_64</arch>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model>EPYC-Rome-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <vendor>AMD</vendor>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <microcode version='16777317'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <signature family='23' model='49' stepping='0'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='x2apic'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='tsc-deadline'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='osxsave'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='hypervisor'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='tsc_adjust'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='spec-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='stibp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='arch-capabilities'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='cmp_legacy'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='topoext'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='virt-ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='lbrv'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='tsc-scale'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='vmcb-clean'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='pause-filter'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='pfthreshold'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='svme-addr-chk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='rdctl-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='skip-l1dfl-vmentry'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='mds-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature name='pschange-mc-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <pages unit='KiB' size='4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <pages unit='KiB' size='2048'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <pages unit='KiB' size='1048576'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </cpu>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <power_management>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <suspend_mem/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </power_management>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <iommu support='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <migration_features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <live/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <uri_transports>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <uri_transport>tcp</uri_transport>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <uri_transport>rdma</uri_transport>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </uri_transports>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </migration_features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <topology>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <cells num='1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <cell id='0'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          <memory unit='KiB'>7864308</memory>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          <pages unit='KiB' size='4'>1966077</pages>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          <pages unit='KiB' size='2048'>0</pages>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          <distances>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <sibling id='0' value='10'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          </distances>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          <cpus num='8'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:          </cpus>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        </cell>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </cells>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </topology>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <cache>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </cache>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <secmodel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model>selinux</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <doi>0</doi>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </secmodel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <secmodel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model>dac</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <doi>0</doi>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </secmodel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </host>
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <guest>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <os_type>hvm</os_type>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <arch name='i686'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <wordsize>32</wordsize>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <domain type='qemu'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <domain type='kvm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </arch>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <pae/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <nonpae/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <acpi default='on' toggle='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <apic default='on' toggle='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <cpuselection/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <deviceboot/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <disksnapshot default='on' toggle='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <externalSnapshot/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </guest>
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <guest>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <os_type>hvm</os_type>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <arch name='x86_64'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <wordsize>64</wordsize>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <domain type='qemu'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <domain type='kvm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </arch>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <acpi default='on' toggle='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <apic default='on' toggle='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <cpuselection/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <deviceboot/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <disksnapshot default='on' toggle='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <externalSnapshot/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </guest>
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 
Jan 26 13:01:48 np0005596062 nova_compute[227313]: </capabilities>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.847 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.851 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 13:01:48 np0005596062 nova_compute[227313]: <domainCapabilities>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <domain>kvm</domain>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <arch>i686</arch>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <vcpu max='4096'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <iothreads supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <os supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <enum name='firmware'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <loader supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>rom</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>pflash</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='readonly'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>yes</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>no</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='secure'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>no</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </loader>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <cpu>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>on</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>off</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='maximumMigratable'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>on</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>off</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <vendor>AMD</vendor>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='succor'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='custom' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ddpd-u'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sha512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ddpd-u'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sha512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='perfmon-v2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Turin'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='perfmon-v2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbpb'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Turin-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='perfmon-v2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbpb'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-128'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-256'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-128'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-256'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v6'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v7'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='KnightsMill'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512er'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512pf'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='KnightsMill-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512er'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512pf'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G4-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tbm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G5-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tbm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Snowridge'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='athlon'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='athlon-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='core2duo'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='core2duo-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='coreduo'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='coreduo-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='n270'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='n270-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='phenom'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='phenom-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <memoryBacking supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <enum name='sourceType'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <value>file</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <value>anonymous</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <value>memfd</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </memoryBacking>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <disk supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='diskDevice'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>disk</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>cdrom</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>floppy</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>lun</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='bus'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>fdc</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>scsi</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>usb</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>sata</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio-transitional</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio-non-transitional</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <graphics supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>vnc</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>egl-headless</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>dbus</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </graphics>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <video supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='modelType'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>vga</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>cirrus</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>none</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>bochs</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>ramfb</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <hostdev supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='mode'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>subsystem</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='startupPolicy'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>default</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>mandatory</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>requisite</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>optional</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='subsysType'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>usb</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>pci</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>scsi</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='capsType'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='pciBackend'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </hostdev>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <rng supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio-transitional</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtio-non-transitional</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='backendModel'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>random</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>egd</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>builtin</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <filesystem supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='driverType'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>path</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>handle</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>virtiofs</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </filesystem>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <tpm supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>tpm-tis</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>tpm-crb</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='backendModel'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>emulator</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>external</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='backendVersion'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>2.0</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </tpm>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <redirdev supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='bus'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>usb</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </redirdev>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <channel supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>pty</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>unix</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </channel>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <crypto supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='model'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>qemu</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='backendModel'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>builtin</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </crypto>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <interface supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='backendType'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>default</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>passt</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <panic supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>isa</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>hyperv</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </panic>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <console supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>null</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>vc</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>pty</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>dev</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>file</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>pipe</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>stdio</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>udp</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>tcp</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>unix</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>qemu-vdagent</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>dbus</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </console>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <gic supported='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <vmcoreinfo supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <genid supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <backingStoreInput supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <backup supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <async-teardown supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <s390-pv supported='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <ps2 supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <tdx supported='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <sev supported='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <sgx supported='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <hyperv supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='features'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>relaxed</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>vapic</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>spinlocks</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>vpindex</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>runtime</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>synic</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>stimer</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>reset</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>vendor_id</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>frequencies</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>reenlightenment</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>tlbflush</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>ipi</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>avic</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>emsr_bitmap</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>xmm_input</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <defaults>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <spinlocks>4095</spinlocks>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <stimer_direct>on</stimer_direct>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </defaults>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </hyperv>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <launchSecurity supported='no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:01:48 np0005596062 nova_compute[227313]: </domainCapabilities>
Jan 26 13:01:48 np0005596062 nova_compute[227313]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.858 227317 DEBUG nova.virt.libvirt.volume.mount [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 26 13:01:48 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.859 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 13:01:48 np0005596062 nova_compute[227313]: <domainCapabilities>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <domain>kvm</domain>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <arch>i686</arch>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <vcpu max='240'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <iothreads supported='yes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <os supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <enum name='firmware'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <loader supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>rom</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>pflash</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='readonly'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>yes</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>no</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='secure'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>no</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </loader>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:  <cpu>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>on</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>off</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <enum name='maximumMigratable'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>on</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <value>off</value>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <vendor>AMD</vendor>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='succor'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:    <mode name='custom' supported='yes'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ddpd-u'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sha512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ddpd-u'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sha512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm3'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sm4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='perfmon-v2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Milan-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Rome-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Turin'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='perfmon-v2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbpb'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Turin-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vp2intersect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fs-gs-base-ns'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibpb-brtype'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='perfmon-v2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbpb'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='srso-user-kernel-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='EPYC-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-128'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-256'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='GraniteRapids-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-128'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-256'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx10-512'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Haswell-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v6'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Icelake-Server-v7'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge-IBRS'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='IvyBridge-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='KnightsMill'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512er'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512pf'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='KnightsMill-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4fmaps'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-4vnniw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512er'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512pf'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G4-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G5'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tbm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='Opteron_G5-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fma4'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tbm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xop'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SapphireRapids-v4'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='amx-tile'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-fp16'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrc'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fzrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='tsx-ldtrk'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest-v1'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest-v2'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:      <blockers model='SierraForest-v3'>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:48 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v3'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Client-v4'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v3'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v4'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Skylake-Server-v5'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Snowridge'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v3'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='core-capability'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='split-lock-detect'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Snowridge-v4'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='athlon'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='athlon-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='core2duo'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='core2duo-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='coreduo'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='coreduo-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='n270'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='n270-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='phenom'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='phenom-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnow'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='3dnowext'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <memoryBacking supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <enum name='sourceType'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>file</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>anonymous</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>memfd</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  </memoryBacking>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <disk supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='diskDevice'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>disk</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>cdrom</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>floppy</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>lun</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='bus'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>ide</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>fdc</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>scsi</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>usb</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>sata</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio-transitional</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio-non-transitional</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <graphics supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>vnc</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>egl-headless</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>dbus</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </graphics>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <video supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='modelType'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>vga</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>cirrus</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>none</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>bochs</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>ramfb</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <hostdev supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='mode'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>subsystem</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='startupPolicy'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>default</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>mandatory</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>requisite</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>optional</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='subsysType'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>usb</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>pci</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>scsi</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='capsType'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='pciBackend'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </hostdev>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <rng supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio-transitional</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtio-non-transitional</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='backendModel'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>random</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>egd</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>builtin</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <filesystem supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='driverType'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>path</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>handle</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>virtiofs</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </filesystem>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <tpm supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>tpm-tis</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>tpm-crb</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='backendModel'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>emulator</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>external</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='backendVersion'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>2.0</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </tpm>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <redirdev supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='bus'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>usb</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </redirdev>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <channel supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>pty</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>unix</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </channel>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <crypto supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='model'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>qemu</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='backendModel'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>builtin</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </crypto>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <interface supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='backendType'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>default</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>passt</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <panic supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='model'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>isa</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>hyperv</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </panic>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <console supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>null</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>vc</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>pty</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>dev</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>file</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>pipe</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>stdio</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>udp</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>tcp</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>unix</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>qemu-vdagent</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>dbus</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </console>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <gic supported='no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <vmcoreinfo supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <genid supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <backingStoreInput supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <backup supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <async-teardown supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <s390-pv supported='no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <ps2 supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <tdx supported='no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <sev supported='no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <sgx supported='no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <hyperv supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='features'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>relaxed</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>vapic</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>spinlocks</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>vpindex</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>runtime</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>synic</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>stimer</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>reset</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>vendor_id</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>frequencies</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>reenlightenment</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>tlbflush</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>ipi</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>avic</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>emsr_bitmap</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>xmm_input</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <defaults>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <spinlocks>4095</spinlocks>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <stimer_direct>on</stimer_direct>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </defaults>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </hyperv>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <launchSecurity supported='no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:01:49 np0005596062 nova_compute[227313]: </domainCapabilities>
Jan 26 13:01:49 np0005596062 nova_compute[227313]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 26 13:01:49 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.918 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 26 13:01:49 np0005596062 nova_compute[227313]: 2026-01-26 18:01:48.923 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 13:01:49 np0005596062 nova_compute[227313]: <domainCapabilities>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <domain>kvm</domain>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <arch>x86_64</arch>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <vcpu max='4096'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <iothreads supported='yes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <os supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <enum name='firmware'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>efi</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <loader supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='type'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>rom</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>pflash</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='readonly'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>yes</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>no</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='secure'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>yes</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>no</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </loader>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:  <cpu>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <mode name='host-passthrough' supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='hostPassthroughMigratable'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>on</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>off</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <mode name='maximum' supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <enum name='maximumMigratable'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>on</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <value>off</value>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </enum>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <mode name='host-model' supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <vendor>AMD</vendor>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='x2apic'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='hypervisor'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='stibp'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='ssbd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='overflow-recov'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='succor'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='ibrs'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='lbrv'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='tsc-scale'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='flushbyasid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='pause-filter'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='pfthreshold'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <feature policy='disable' name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    </mode>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:    <mode name='custom' supported='yes'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-IBRS'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-noTSX'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v3'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Broadwell-v4'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='ClearwaterForest'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='bhi-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ddpd-u'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sha512'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sm3'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sm4'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='ClearwaterForest-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-ifma'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-ne-convert'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx-vnni-int8'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='bhi-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='bhi-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='bus-lock-detect'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cldemote'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='cmpccxadd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ddpd-u'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fbsdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fsrs'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='intel-psfd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ipred-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='lam'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mcdt-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdir64b'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='movdiri'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pbrsb-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='prefetchiti'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='psdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rrsba-ctrl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sbdr-ssdp-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='serialize'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sha512'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sm3'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='sm4'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ss'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Cooperlake-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='hle'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='ibrs-all'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='rtm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='taa-no'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Denverton'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='mpx'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Denverton-v3'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='Dhyana-v2'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512f'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512ifma'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vbmi2'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vl'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512vnni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='erms'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='fsrm'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='gfni'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='invpcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='la57'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='no-nested-data-bp'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='null-sel-clr-base'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pcid'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='pku'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='stibp-always-on'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vaes'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='vpclmulqdq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='xsaves'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      </blockers>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='amd-psfd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='auto-ibrs'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-bf16'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512-vpopcntdq'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bitalg'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512bw'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512cd'/>
Jan 26 13:01:49 np0005596062 nova_compute[227313]:        <feature name='avx512dq'/>
Jan 26 13:02:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:02:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:36.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:02:36 np0005596062 rsyslogd[1005]: imjournal: 3879 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 26 13:02:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:36.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:38.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:38.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:40.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:02:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:40.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:02:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:42.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:02:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:42.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:02:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:44.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:02:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:02:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:44.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:02:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:02:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:46.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:46.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:46 np0005596062 podman[228092]: 2026-01-26 18:02:46.862806436 +0000 UTC m=+0.072469304 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.053 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.054 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.054 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.054 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:02:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:48.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.185 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.186 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.186 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.186 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.186 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.187 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.187 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.187 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.187 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.239 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.240 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.240 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.240 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.240 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:02:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:02:48 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3626267478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.678 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:02:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.833 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.834 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5309MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.834 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:02:48 np0005596062 nova_compute[227313]: 2026-01-26 18:02:48.835 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.217 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.217 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.241 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:02:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:02:49 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/846303989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.669 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.675 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.713 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.777 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:02:49 np0005596062 nova_compute[227313]: 2026-01-26 18:02:49.778 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:02:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:50.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:02:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:02:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:02:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:52.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:52.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:54 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 13:02:54 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 26 13:02:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:54.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:54 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 26 13:02:55 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 26 13:02:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:02:55 np0005596062 podman[228159]: 2026-01-26 18:02:55.902213307 +0000 UTC m=+0.106401605 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true)
Jan 26 13:02:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:56.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:56.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:02:58.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:02:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:02:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:02:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:02:58.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:00.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:00.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:02.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:02.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:04.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:04.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:06.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:06.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:08.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:08.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:03:09.151 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:03:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:03:09.152 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:03:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:03:09.152 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:03:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 13:03:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 13:03:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:03:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:03:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:10.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:10.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:03:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:03:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:03:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:12.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:12.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:14.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:14.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:16.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:16.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:17 np0005596062 podman[228500]: 2026-01-26 18:03:17.892823552 +0000 UTC m=+0.092059090 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 13:03:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:18.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:18.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:20.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:03:20 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:03:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:03:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:20.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:03:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:22.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:22.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:24.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:24.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:26.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:26 np0005596062 podman[228628]: 2026-01-26 18:03:26.918819166 +0000 UTC m=+0.118260585 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:03:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:28.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:28.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:30.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:30.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:32.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:32.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:34.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:34.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:35 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:36.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:36.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:38.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:38.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:03:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:40.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:03:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:40.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:42.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:42.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:44.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:44.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:46.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:46.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:48.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:48 np0005596062 podman[228716]: 2026-01-26 18:03:48.876106459 +0000 UTC m=+0.072204770 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.769 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.770 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.817 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.818 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.818 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.819 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.819 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.819 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.851 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.851 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.852 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.853 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:03:49 np0005596062 nova_compute[227313]: 2026-01-26 18:03:49.853 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:03:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:50.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:03:50 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/342325034' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.359 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.549 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.550 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5302MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.550 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.551 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.632 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.633 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:03:50 np0005596062 nova_compute[227313]: 2026-01-26 18:03:50.657 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:03:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:50.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:03:51 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4174924310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.111 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.118 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.143 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.145 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.145 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.377 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.377 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.378 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.401 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.402 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:51 np0005596062 nova_compute[227313]: 2026-01-26 18:03:51.402 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:03:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:52.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:52.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:54.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:54.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:03:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:03:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:56.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:03:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:56.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:03:57 np0005596062 podman[228783]: 2026-01-26 18:03:57.945350915 +0000 UTC m=+0.145815717 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 13:03:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:03:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:03:58.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:03:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:03:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:03:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:03:58.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:00.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:00.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:01 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:04:01.478 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:04:01 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:04:01.479 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:04:01 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:04:01.480 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:04:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:04:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:02.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:04:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:02.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:04.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:04.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:06.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:06.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:08.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:08.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:04:09.152 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:04:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:04:09.153 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:04:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:04:09.153 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:04:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:10.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:04:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:10.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:04:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:12.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:12.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:14.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:14.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:16.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:16.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:18.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:18.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:19 np0005596062 podman[228894]: 2026-01-26 18:04:19.679145487 +0000 UTC m=+0.075748463 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 26 13:04:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:04:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:20.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.682424) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450660682466, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2363, "num_deletes": 251, "total_data_size": 5930527, "memory_usage": 6004304, "flush_reason": "Manual Compaction"}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450660722657, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3872519, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17950, "largest_seqno": 20308, "table_properties": {"data_size": 3862943, "index_size": 6070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19757, "raw_average_key_size": 20, "raw_value_size": 3843611, "raw_average_value_size": 3962, "num_data_blocks": 270, "num_entries": 970, "num_filter_entries": 970, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450422, "oldest_key_time": 1769450422, "file_creation_time": 1769450660, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 40320 microseconds, and 8507 cpu microseconds.
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.722740) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3872519 bytes OK
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.722760) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.733774) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.733824) EVENT_LOG_v1 {"time_micros": 1769450660733813, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.733848) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 5920215, prev total WAL file size 5920215, number of live WAL files 2.
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.735751) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3781KB)], [36(7958KB)]
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450660735886, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 12021883, "oldest_snapshot_seqno": -1}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4541 keys, 9945476 bytes, temperature: kUnknown
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450660845618, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9945476, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9912188, "index_size": 20826, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 113660, "raw_average_key_size": 25, "raw_value_size": 9826995, "raw_average_value_size": 2164, "num_data_blocks": 863, "num_entries": 4541, "num_filter_entries": 4541, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450660, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.846203) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9945476 bytes
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.856753) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.3 rd, 90.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 7.8 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(5.7) write-amplify(2.6) OK, records in: 5068, records dropped: 527 output_compression: NoCompression
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.856794) EVENT_LOG_v1 {"time_micros": 1769450660856777, "job": 20, "event": "compaction_finished", "compaction_time_micros": 109961, "compaction_time_cpu_micros": 30318, "output_level": 6, "num_output_files": 1, "total_output_size": 9945476, "num_input_records": 5068, "num_output_records": 4541, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450660858415, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450660861279, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.735413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.861393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.861401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.861404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.861407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:04:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:04:20.861410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:04:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:20.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 13:04:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:04:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:04:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:04:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:22.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:22.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:24.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:24.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:26.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:26.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:28.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:28.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:28 np0005596062 podman[229077]: 2026-01-26 18:04:28.913045569 +0000 UTC m=+0.112932455 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:04:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:04:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:04:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:30.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:30.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:32.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:32.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:34.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:34.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:36.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:36.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:38.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:38.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:40.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:40.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:42.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:42.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:44.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:04:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:44.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:04:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:46.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:46.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:48 np0005596062 nova_compute[227313]: 2026-01-26 18:04:48.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:48.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:48.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:49 np0005596062 nova_compute[227313]: 2026-01-26 18:04:49.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:49 np0005596062 podman[229215]: 2026-01-26 18:04:49.899135742 +0000 UTC m=+0.105554393 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.080 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.080 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.081 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.082 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.082 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.083 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.114 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.115 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.116 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.116 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.117 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:04:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:50.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:04:50 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/155194947' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.583 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.776 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.778 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5331MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.778 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.778 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.857 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.858 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:04:50 np0005596062 nova_compute[227313]: 2026-01-26 18:04:50.875 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:04:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:50.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:04:51 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/703931430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:04:51 np0005596062 nova_compute[227313]: 2026-01-26 18:04:51.383 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:04:51 np0005596062 nova_compute[227313]: 2026-01-26 18:04:51.389 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:04:51 np0005596062 nova_compute[227313]: 2026-01-26 18:04:51.576 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:04:51 np0005596062 nova_compute[227313]: 2026-01-26 18:04:51.579 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:04:51 np0005596062 nova_compute[227313]: 2026-01-26 18:04:51.580 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:04:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:52.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:52 np0005596062 nova_compute[227313]: 2026-01-26 18:04:52.551 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:52 np0005596062 nova_compute[227313]: 2026-01-26 18:04:52.552 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:04:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:52.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:54.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:54.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:04:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:04:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:56.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:04:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:56.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:04:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:04:58.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:04:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:04:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:04:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:04:58.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:04:59 np0005596062 podman[229283]: 2026-01-26 18:04:59.938774669 +0000 UTC m=+0.142726115 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 13:05:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:00.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:00.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:02.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:04.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:06.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:06.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:08.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:08.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:05:09.154 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:05:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:05:09.154 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:05:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:05:09.155 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:05:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:10.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:12.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:12.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:14.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:14.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:16.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:16.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:18.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:18.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=404 latency=0.002000053s ======
Jan 26 13:05:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:20.129 +0000] "GET /healthcheck HTTP/1.1" 404 240 - "python-urllib3/1.26.5" - latency=0.002000053s
Jan 26 13:05:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:20.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:20 np0005596062 podman[229371]: 2026-01-26 18:05:20.864577342 +0000 UTC m=+0.073899115 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 26 13:05:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:20.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:22.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:22.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 26 13:05:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:24.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:24.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 26 13:05:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:26.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:26.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 26 13:05:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:28.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:28.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:30.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:30 np0005596062 podman[229577]: 2026-01-26 18:05:30.942291059 +0000 UTC m=+0.144700448 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 13:05:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:30.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:05:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:05:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:05:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 26 13:05:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:32.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:32.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 26 13:05:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:34.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:34.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:36.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:36.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:38.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:38.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:05:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:05:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:40.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:40.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:42.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000052s ======
Jan 26 13:05:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:42.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Jan 26 13:05:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:44.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:44.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:46.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:46.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:48.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:05:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:48.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:05:50 np0005596062 nova_compute[227313]: 2026-01-26 18:05:50.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:50 np0005596062 nova_compute[227313]: 2026-01-26 18:05:50.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:50 np0005596062 nova_compute[227313]: 2026-01-26 18:05:50.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:50 np0005596062 nova_compute[227313]: 2026-01-26 18:05:50.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:50 np0005596062 nova_compute[227313]: 2026-01-26 18:05:50.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:05:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:50.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:50 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:05:50.772 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:05:50 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:05:50.775 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:05:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:50.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.067 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.067 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.067 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.081 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.081 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.104 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.104 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.105 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.105 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.105 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:05:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:05:51 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/643369921' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.575 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.755 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.756 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5294MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.756 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.756 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.835 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.836 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:05:51 np0005596062 podman[229735]: 2026-01-26 18:05:51.838855702 +0000 UTC m=+0.048365490 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 13:05:51 np0005596062 nova_compute[227313]: 2026-01-26 18:05:51.850 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:05:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:05:52 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3557179529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:05:52 np0005596062 nova_compute[227313]: 2026-01-26 18:05:52.314 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:05:52 np0005596062 nova_compute[227313]: 2026-01-26 18:05:52.322 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:05:52 np0005596062 nova_compute[227313]: 2026-01-26 18:05:52.344 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:05:52 np0005596062 nova_compute[227313]: 2026-01-26 18:05:52.348 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:05:52 np0005596062 nova_compute[227313]: 2026-01-26 18:05:52.348 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:05:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:52.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:52.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:53 np0005596062 nova_compute[227313]: 2026-01-26 18:05:53.319 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:53 np0005596062 nova_compute[227313]: 2026-01-26 18:05:53.319 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:53 np0005596062 nova_compute[227313]: 2026-01-26 18:05:53.320 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:05:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:54.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:54.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:05:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:56.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:56.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 26 13:05:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:05:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:05:58.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:05:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:05:58.777 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:05:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:05:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:05:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:05:58.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:05:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 26 13:06:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:00.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:00.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:01 np0005596062 podman[229780]: 2026-01-26 18:06:01.913276741 +0000 UTC m=+0.114328774 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 26 13:06:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:02.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:03.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:04.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:05.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:06:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:06.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:06:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:07.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.051 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.052 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.072 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.167 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.168 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.175 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.175 227317 INFO nova.compute.claims [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.286 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:08.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 26 13:06:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:06:08 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3533681138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.757 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.765 227317 DEBUG nova.compute.provider_tree [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.801 227317 DEBUG nova.scheduler.client.report [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.835 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.836 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.895 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.895 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.922 227317 INFO nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:06:08 np0005596062 nova_compute[227313]: 2026-01-26 18:06:08.950 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:06:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:06:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:09.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.038 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.040 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.041 227317 INFO nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Creating image(s)#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.089 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.127 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:09.155 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:09.156 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:09.157 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.175 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.181 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.183 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:09 np0005596062 nova_compute[227313]: 2026-01-26 18:06:09.984 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Automatically allocating a network for project 0edb4019e89c4674848ec75122984916. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Jan 26 13:06:10 np0005596062 nova_compute[227313]: 2026-01-26 18:06:10.108 227317 DEBUG nova.virt.libvirt.imagebackend [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Image locations are: [{'url': 'rbd://d4cd1917-5876-51b6-bc64-65a16199754d/images/57de5960-c1c5-4cfa-af34-8f58cf25f585/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://d4cd1917-5876-51b6-bc64-65a16199754d/images/57de5960-c1c5-4cfa-af34-8f58cf25f585/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 26 13:06:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:10.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:11.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:11 np0005596062 nova_compute[227313]: 2026-01-26 18:06:11.801 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:11 np0005596062 nova_compute[227313]: 2026-01-26 18:06:11.862 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:11 np0005596062 nova_compute[227313]: 2026-01-26 18:06:11.863 227317 DEBUG nova.virt.images [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] 57de5960-c1c5-4cfa-af34-8f58cf25f585 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 26 13:06:11 np0005596062 nova_compute[227313]: 2026-01-26 18:06:11.864 227317 DEBUG nova.privsep.utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 26 13:06:11 np0005596062 nova_compute[227313]: 2026-01-26 18:06:11.864 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.part /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.134 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.part /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.converted" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.139 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.228 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216.converted --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.230 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.269 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.274 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:12.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.670 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.749 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] resizing rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.885 227317 DEBUG nova.objects.instance [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lazy-loading 'migration_context' on Instance uuid 76fb1ebb-6b94-4c1e-96b6-352821eff2cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.903 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.903 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Ensure instance console log exists: /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.904 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.905 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:12 np0005596062 nova_compute[227313]: 2026-01-26 18:06:12.906 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:13.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:06:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:14.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:06:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:15.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:15 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 13:06:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:16.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:06:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:17.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:06:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:18.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:06:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:19.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:06:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:20.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:21.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:21 np0005596062 nova_compute[227313]: 2026-01-26 18:06:21.347 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Automatically allocated network: {'id': '0233ae30-2e5a-4e12-9142-37047ec40cce', 'name': 'auto_allocated_network', 'tenant_id': '0edb4019e89c4674848ec75122984916', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['67039721-515c-4cde-ae20-7ece9fb11b87', 'f031f499-16a3-416c-9fdf-487a31751487'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-26T18:06:10Z', 'updated_at': '2026-01-26T18:06:20Z', 'revision_number': 4, 'project_id': '0edb4019e89c4674848ec75122984916'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Jan 26 13:06:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:21 np0005596062 nova_compute[227313]: 2026-01-26 18:06:21.368 227317 WARNING oslo_policy.policy [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 26 13:06:21 np0005596062 nova_compute[227313]: 2026-01-26 18:06:21.369 227317 WARNING oslo_policy.policy [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 26 13:06:21 np0005596062 nova_compute[227313]: 2026-01-26 18:06:21.373 227317 DEBUG nova.policy [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44d840a696d1433d91d7424baebdfd6b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0edb4019e89c4674848ec75122984916', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:06:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:22.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:22 np0005596062 nova_compute[227313]: 2026-01-26 18:06:22.878 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Successfully created port: 36635dd9-db93-4788-a953-82e84680a474 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:06:22 np0005596062 podman[230068]: 2026-01-26 18:06:22.888938596 +0000 UTC m=+0.079649867 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 13:06:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:23.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.626162) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450783626259, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1491, "num_deletes": 250, "total_data_size": 3311722, "memory_usage": 3352960, "flush_reason": "Manual Compaction"}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450783638087, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1347920, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20313, "largest_seqno": 21799, "table_properties": {"data_size": 1342900, "index_size": 2352, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12765, "raw_average_key_size": 20, "raw_value_size": 1331908, "raw_average_value_size": 2141, "num_data_blocks": 105, "num_entries": 622, "num_filter_entries": 622, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450660, "oldest_key_time": 1769450660, "file_creation_time": 1769450783, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 11985 microseconds, and 4202 cpu microseconds.
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.638153) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1347920 bytes OK
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.638177) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.640188) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.640217) EVENT_LOG_v1 {"time_micros": 1769450783640209, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.640238) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 3304827, prev total WAL file size 3304827, number of live WAL files 2.
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.641429) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353032' seq:72057594037927935, type:22 .. '6D67727374617400373533' seq:0, type:0; will stop at (end)
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1316KB)], [39(9712KB)]
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450783641522, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 11293396, "oldest_snapshot_seqno": -1}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4700 keys, 8272800 bytes, temperature: kUnknown
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450783728304, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 8272800, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8241274, "index_size": 18680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 117415, "raw_average_key_size": 24, "raw_value_size": 8155958, "raw_average_value_size": 1735, "num_data_blocks": 769, "num_entries": 4700, "num_filter_entries": 4700, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450783, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.728555) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 8272800 bytes
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.730137) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.0 rd, 95.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 9.5 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(14.5) write-amplify(6.1) OK, records in: 5163, records dropped: 463 output_compression: NoCompression
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.730154) EVENT_LOG_v1 {"time_micros": 1769450783730145, "job": 22, "event": "compaction_finished", "compaction_time_micros": 86882, "compaction_time_cpu_micros": 34708, "output_level": 6, "num_output_files": 1, "total_output_size": 8272800, "num_input_records": 5163, "num_output_records": 4700, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450783730529, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450783732155, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.641210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.732264) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.732271) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.732273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.732275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:06:23 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:06:23.732277) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:06:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:24.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:24 np0005596062 nova_compute[227313]: 2026-01-26 18:06:24.796 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Successfully updated port: 36635dd9-db93-4788-a953-82e84680a474 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:06:24 np0005596062 nova_compute[227313]: 2026-01-26 18:06:24.820 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "refresh_cache-76fb1ebb-6b94-4c1e-96b6-352821eff2cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:06:24 np0005596062 nova_compute[227313]: 2026-01-26 18:06:24.821 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquired lock "refresh_cache-76fb1ebb-6b94-4c1e-96b6-352821eff2cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:06:24 np0005596062 nova_compute[227313]: 2026-01-26 18:06:24.821 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:06:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:25.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:25 np0005596062 nova_compute[227313]: 2026-01-26 18:06:25.196 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:06:25 np0005596062 nova_compute[227313]: 2026-01-26 18:06:25.497 227317 DEBUG nova.compute.manager [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-changed-36635dd9-db93-4788-a953-82e84680a474 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:06:25 np0005596062 nova_compute[227313]: 2026-01-26 18:06:25.497 227317 DEBUG nova.compute.manager [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Refreshing instance network info cache due to event network-changed-36635dd9-db93-4788-a953-82e84680a474. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:06:25 np0005596062 nova_compute[227313]: 2026-01-26 18:06:25.498 227317 DEBUG oslo_concurrency.lockutils [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-76fb1ebb-6b94-4c1e-96b6-352821eff2cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:06:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:26.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:27.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.476 227317 DEBUG nova.network.neutron [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Updating instance_info_cache with network_info: [{"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.500 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Releasing lock "refresh_cache-76fb1ebb-6b94-4c1e-96b6-352821eff2cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.501 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Instance network_info: |[{"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.502 227317 DEBUG oslo_concurrency.lockutils [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-76fb1ebb-6b94-4c1e-96b6-352821eff2cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.502 227317 DEBUG nova.network.neutron [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Refreshing network info cache for port 36635dd9-db93-4788-a953-82e84680a474 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.508 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Start _get_guest_xml network_info=[{"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.515 227317 WARNING nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.526 227317 DEBUG nova.virt.libvirt.host [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.527 227317 DEBUG nova.virt.libvirt.host [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.531 227317 DEBUG nova.virt.libvirt.host [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.532 227317 DEBUG nova.virt.libvirt.host [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.534 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.535 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.536 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.536 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.537 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.537 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.537 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.538 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.538 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.539 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.540 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.540 227317 DEBUG nova.virt.hardware [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.546 227317 DEBUG nova.privsep.utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 26 13:06:27 np0005596062 nova_compute[227313]: 2026-01-26 18:06:27.546 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:06:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3946842906' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.060 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.097 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.102 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:06:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3560277162' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.521 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.524 227317 DEBUG nova.virt.libvirt.vif [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1013927775-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1013927775-1',id=2,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0edb4019e89c4674848ec75122984916',ramdisk_id='',reservation_id='r-872mjeag',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1369791216',owner_user_name='tempest-AutoAllocateNetworkTest-1369791216-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:06:08Z,user_data=None,user_id='44d840a696d1433d91d7424baebdfd6b',uuid=76fb1ebb-6b94-4c1e-96b6-352821eff2cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.525 227317 DEBUG nova.network.os_vif_util [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Converting VIF {"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.528 227317 DEBUG nova.network.os_vif_util [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.531 227317 DEBUG nova.objects.instance [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lazy-loading 'pci_devices' on Instance uuid 76fb1ebb-6b94-4c1e-96b6-352821eff2cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.546 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <uuid>76fb1ebb-6b94-4c1e-96b6-352821eff2cc</uuid>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <name>instance-00000002</name>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:name>tempest-tempest.common.compute-instance-1013927775-1</nova:name>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:06:27</nova:creationTime>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:user uuid="44d840a696d1433d91d7424baebdfd6b">tempest-AutoAllocateNetworkTest-1369791216-project-member</nova:user>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:project uuid="0edb4019e89c4674848ec75122984916">tempest-AutoAllocateNetworkTest-1369791216</nova:project>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <nova:port uuid="36635dd9-db93-4788-a953-82e84680a474">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.1.0.4" ipVersion="4"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="fdfe:381f:8400::5" ipVersion="6"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <entry name="serial">76fb1ebb-6b94-4c1e-96b6-352821eff2cc</entry>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <entry name="uuid">76fb1ebb-6b94-4c1e-96b6-352821eff2cc</entry>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk.config">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:32:26:2b"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <target dev="tap36635dd9-db"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/console.log" append="off"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:06:28 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:06:28 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:06:28 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:06:28 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.548 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Preparing to wait for external event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.549 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.549 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.549 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.550 227317 DEBUG nova.virt.libvirt.vif [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1013927775-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1013927775-1',id=2,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0edb4019e89c4674848ec75122984916',ramdisk_id='',reservation_id='r-872mjeag',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1369791216',owner_user_name='tempest-AutoAllocateNetworkTest-1369791216-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:06:08Z,user_data=None,user_id='44d840a696d1433d91d7424baebdfd6b',uuid=76fb1ebb-6b94-4c1e-96b6-352821eff2cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.550 227317 DEBUG nova.network.os_vif_util [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Converting VIF {"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.551 227317 DEBUG nova.network.os_vif_util [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.551 227317 DEBUG os_vif [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:06:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:28.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.582 227317 DEBUG ovsdbapp.backend.ovs_idl [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.583 227317 DEBUG ovsdbapp.backend.ovs_idl [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.583 227317 DEBUG ovsdbapp.backend.ovs_idl [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.584 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.586 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.586 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.587 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.601 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.601 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:06:28 np0005596062 nova_compute[227313]: 2026-01-26 18:06:28.602 227317 INFO oslo.privsep.daemon [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpd3jcdv8n/privsep.sock']#033[00m
Jan 26 13:06:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:29.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.374 227317 INFO oslo.privsep.daemon [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.235 230207 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.242 230207 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.246 230207 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.246 230207 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230207#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.519 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.744 227317 DEBUG nova.network.neutron [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Updated VIF entry in instance network info cache for port 36635dd9-db93-4788-a953-82e84680a474. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.745 227317 DEBUG nova.network.neutron [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Updating instance_info_cache with network_info: [{"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.751 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.751 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36635dd9-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.752 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36635dd9-db, col_values=(('external_ids', {'iface-id': '36635dd9-db93-4788-a953-82e84680a474', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:26:2b', 'vm-uuid': '76fb1ebb-6b94-4c1e-96b6-352821eff2cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:29 np0005596062 NetworkManager[48993]: <info>  [1769450789.7575] manager: (tap36635dd9-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.758 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.767 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.770 227317 INFO os_vif [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db')#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.772 227317 DEBUG oslo_concurrency.lockutils [req-17d2f506-78f9-4be3-9929-8be3644ed7a2 req-4428af68-c6ee-4f22-b2c9-d2838a38824c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-76fb1ebb-6b94-4c1e-96b6-352821eff2cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.821 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.821 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.821 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] No VIF found with MAC fa:16:3e:32:26:2b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.822 227317 INFO nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Using config drive#033[00m
Jan 26 13:06:29 np0005596062 nova_compute[227313]: 2026-01-26 18:06:29.850 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:30.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:30.999 227317 INFO nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Creating config drive at /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/disk.config#033[00m
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.005 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2xdfzyq3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:31.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.147 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2xdfzyq3" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.186 227317 DEBUG nova.storage.rbd_utils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] rbd image 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.193 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/disk.config 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.419 227317 DEBUG oslo_concurrency.processutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/disk.config 76fb1ebb-6b94-4c1e-96b6-352821eff2cc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.420 227317 INFO nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Deleting local config drive /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc/disk.config because it was imported into RBD.#033[00m
Jan 26 13:06:31 np0005596062 systemd[1]: Starting libvirt secret daemon...
Jan 26 13:06:31 np0005596062 systemd[1]: Started libvirt secret daemon.
Jan 26 13:06:31 np0005596062 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 26 13:06:31 np0005596062 kernel: tap36635dd9-db: entered promiscuous mode
Jan 26 13:06:31 np0005596062 NetworkManager[48993]: <info>  [1769450791.5669] manager: (tap36635dd9-db): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 26 13:06:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:31Z|00027|binding|INFO|Claiming lport 36635dd9-db93-4788-a953-82e84680a474 for this chassis.
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.605 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:31Z|00028|binding|INFO|36635dd9-db93-4788-a953-82e84680a474: Claiming fa:16:3e:32:26:2b 10.1.0.4 fdfe:381f:8400::5
Jan 26 13:06:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:31.626 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:26:2b 10.1.0.4 fdfe:381f:8400::5'], port_security=['fa:16:3e:32:26:2b 10.1.0.4 fdfe:381f:8400::5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.4/26 fdfe:381f:8400::5/64', 'neutron:device_id': '76fb1ebb-6b94-4c1e-96b6-352821eff2cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0233ae30-2e5a-4e12-9142-37047ec40cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0edb4019e89c4674848ec75122984916', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b8b977b7-e75f-401b-bfd0-7066aad28c16', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4f9fdf8-90b6-44b5-be73-6e7a7109730a, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=36635dd9-db93-4788-a953-82e84680a474) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:06:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:31.628 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 36635dd9-db93-4788-a953-82e84680a474 in datapath 0233ae30-2e5a-4e12-9142-37047ec40cce bound to our chassis#033[00m
Jan 26 13:06:31 np0005596062 systemd-udevd[230308]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:06:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:31.631 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0233ae30-2e5a-4e12-9142-37047ec40cce#033[00m
Jan 26 13:06:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:31.632 143929 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp6gby9mr6/privsep.sock']#033[00m
Jan 26 13:06:31 np0005596062 NetworkManager[48993]: <info>  [1769450791.6541] device (tap36635dd9-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:06:31 np0005596062 NetworkManager[48993]: <info>  [1769450791.6548] device (tap36635dd9-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:06:31 np0005596062 systemd-machined[195380]: New machine qemu-1-instance-00000002.
Jan 26 13:06:31 np0005596062 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.703 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:31Z|00029|binding|INFO|Setting lport 36635dd9-db93-4788-a953-82e84680a474 ovn-installed in OVS
Jan 26 13:06:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:31Z|00030|binding|INFO|Setting lport 36635dd9-db93-4788-a953-82e84680a474 up in Southbound
Jan 26 13:06:31 np0005596062 nova_compute[227313]: 2026-01-26 18:06:31.712 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.230 227317 DEBUG nova.compute.manager [req-238cfcd5-5855-4d7b-9b9a-c649b0f19e0d req-965ec404-d6f4-444a-8288-224152d2e30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.230 227317 DEBUG oslo_concurrency.lockutils [req-238cfcd5-5855-4d7b-9b9a-c649b0f19e0d req-965ec404-d6f4-444a-8288-224152d2e30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.231 227317 DEBUG oslo_concurrency.lockutils [req-238cfcd5-5855-4d7b-9b9a-c649b0f19e0d req-965ec404-d6f4-444a-8288-224152d2e30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.231 227317 DEBUG oslo_concurrency.lockutils [req-238cfcd5-5855-4d7b-9b9a-c649b0f19e0d req-965ec404-d6f4-444a-8288-224152d2e30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.231 227317 DEBUG nova.compute.manager [req-238cfcd5-5855-4d7b-9b9a-c649b0f19e0d req-965ec404-d6f4-444a-8288-224152d2e30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Processing event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.294 143929 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.295 143929 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6gby9mr6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.174 230329 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.179 230329 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.181 230329 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.181 230329 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230329#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.297 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5f72e082-be2f-4718-9dcc-d8828b3a8309]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.372 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450792.3715491, 76fb1ebb-6b94-4c1e-96b6-352821eff2cc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.372 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] VM Started (Lifecycle Event)#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.374 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.378 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.381 227317 INFO nova.virt.libvirt.driver [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Instance spawned successfully.#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.382 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.428 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.431 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.451 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.451 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450792.3717995, 76fb1ebb-6b94-4c1e-96b6-352821eff2cc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.451 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.466 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.467 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.467 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.467 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.468 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.468 227317 DEBUG nova.virt.libvirt.driver [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.473 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.476 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450792.377461, 76fb1ebb-6b94-4c1e-96b6-352821eff2cc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.476 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.499 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.502 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.524 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.544 227317 INFO nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Took 23.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.545 227317 DEBUG nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:32.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.616 227317 INFO nova.compute.manager [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Took 24.49 seconds to build instance.#033[00m
Jan 26 13:06:32 np0005596062 nova_compute[227313]: 2026-01-26 18:06:32.635 227317 DEBUG oslo_concurrency.lockutils [None req-9373a014-e8cb-476e-b3bc-a0550da424b1 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 26 13:06:32 np0005596062 podman[230376]: 2026-01-26 18:06:32.920583403 +0000 UTC m=+0.125171491 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.926 230329 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.926 230329 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:32.927 230329 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:33.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.628 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[044ac672-6211-47df-a24d-e25e05f476b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.629 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0233ae30-21 in ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.632 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0233ae30-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.633 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[85ddf293-38ab-4f87-85ce-8f26a2fc8ffb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.639 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7edfa57c-7fa8-47b9-9376-5bfafa535a66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.668 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[70283af0-6a1b-45a1-8c78-83a22be38c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.697 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[50325ab9-551f-4531-b079-4ab586fea31a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:33.699 143929 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmps0xi3wgo/privsep.sock']#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.401 227317 DEBUG nova.compute.manager [req-35ced95a-83e5-4d3b-840a-eaf42f90b099 req-1409283c-495b-4bb9-a2f3-9b1a8eebba12 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.401 227317 DEBUG oslo_concurrency.lockutils [req-35ced95a-83e5-4d3b-840a-eaf42f90b099 req-1409283c-495b-4bb9-a2f3-9b1a8eebba12 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.402 227317 DEBUG oslo_concurrency.lockutils [req-35ced95a-83e5-4d3b-840a-eaf42f90b099 req-1409283c-495b-4bb9-a2f3-9b1a8eebba12 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.402 227317 DEBUG oslo_concurrency.lockutils [req-35ced95a-83e5-4d3b-840a-eaf42f90b099 req-1409283c-495b-4bb9-a2f3-9b1a8eebba12 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.403 227317 DEBUG nova.compute.manager [req-35ced95a-83e5-4d3b-840a-eaf42f90b099 req-1409283c-495b-4bb9-a2f3-9b1a8eebba12 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] No waiting events found dispatching network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.403 227317 WARNING nova.compute.manager [req-35ced95a-83e5-4d3b-840a-eaf42f90b099 req-1409283c-495b-4bb9-a2f3-9b1a8eebba12 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received unexpected event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.428 143929 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.429 143929 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmps0xi3wgo/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.314 230412 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.320 230412 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.324 230412 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.324 230412 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230412#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.433 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6ed3c5-594b-4664-bc3b-efd408b224b4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.521 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:06:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:34.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:06:34 np0005596062 nova_compute[227313]: 2026-01-26 18:06:34.756 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.981 230412 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.981 230412 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:34.981 230412 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:35.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.598 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[52131170-8db6-4410-9cb3-c68d47e29bd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.626 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[3eec1f55-5bdd-4076-9d3e-08d2e783dd9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 NetworkManager[48993]: <info>  [1769450795.6295] manager: (tap0233ae30-20): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 26 13:06:35 np0005596062 systemd-udevd[230424]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.661 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bec0f8-c625-470e-a0bf-57801aaa2802]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.665 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[d7bb62d3-5d69-437d-9665-f6fe5d5850c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 NetworkManager[48993]: <info>  [1769450795.6888] device (tap0233ae30-20): carrier: link connected
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.695 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[42c4d335-37ba-4170-bb78-467527f7d73b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.720 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[05f8af14-7ecc-45e4-b763-14a2a26b8542]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0233ae30-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:f4:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455573, 'reachable_time': 29997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230442, 'error': None, 'target': 'ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.737 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7932fbde-d1d5-48d3-89a6-a55016b29366]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:f458'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455573, 'tstamp': 455573}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230443, 'error': None, 'target': 'ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.754 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[35319856-9b89-4d28-9b53-a01295d0f51f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0233ae30-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:f4:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455573, 'reachable_time': 29997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230444, 'error': None, 'target': 'ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.789 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd901aa-1d86-415e-97a7-b3a81a2a249d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.875 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[714fbc62-e3a7-4542-bff6-ba967a4e5853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.879 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0233ae30-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.880 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.880 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0233ae30-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:35 np0005596062 kernel: tap0233ae30-20: entered promiscuous mode
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.885 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0233ae30-20, col_values=(('external_ids', {'iface-id': '21642513-87eb-404c-8f9f-3b78ea6c1c25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:35Z|00031|binding|INFO|Releasing lport 21642513-87eb-404c-8f9f-3b78ea6c1c25 from this chassis (sb_readonly=0)
Jan 26 13:06:35 np0005596062 nova_compute[227313]: 2026-01-26 18:06:35.884 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:35 np0005596062 NetworkManager[48993]: <info>  [1769450795.8920] manager: (tap0233ae30-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.906 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0233ae30-2e5a-4e12-9142-37047ec40cce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0233ae30-2e5a-4e12-9142-37047ec40cce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.907 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6d108d3d-4f94-4517-8a84-5c73aefea0dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:35 np0005596062 nova_compute[227313]: 2026-01-26 18:06:35.906 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.908 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-0233ae30-2e5a-4e12-9142-37047ec40cce
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/0233ae30-2e5a-4e12-9142-37047ec40cce.pid.haproxy
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 0233ae30-2e5a-4e12-9142-37047ec40cce
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:06:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:35.909 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce', 'env', 'PROCESS_TAG=haproxy-0233ae30-2e5a-4e12-9142-37047ec40cce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0233ae30-2e5a-4e12-9142-37047ec40cce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:06:36 np0005596062 podman[230478]: 2026-01-26 18:06:36.329991452 +0000 UTC m=+0.062830353 container create 6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 13:06:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:36 np0005596062 systemd[1]: Started libpod-conmon-6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472.scope.
Jan 26 13:06:36 np0005596062 podman[230478]: 2026-01-26 18:06:36.294714009 +0000 UTC m=+0.027552930 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:06:36 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:06:36 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc9358c68d4b6be9632bb19c220c75a990615b1c17383e340267d4480691a175/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:06:36 np0005596062 podman[230478]: 2026-01-26 18:06:36.428889597 +0000 UTC m=+0.161728518 container init 6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 13:06:36 np0005596062 podman[230478]: 2026-01-26 18:06:36.441049959 +0000 UTC m=+0.173888890 container start 6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 13:06:36 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [NOTICE]   (230498) : New worker (230500) forked
Jan 26 13:06:36 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [NOTICE]   (230498) : Loading success.
Jan 26 13:06:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:06:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:36.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:06:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:37.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:38.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 26 13:06:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:39.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:39 np0005596062 nova_compute[227313]: 2026-01-26 18:06:39.524 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:39 np0005596062 nova_compute[227313]: 2026-01-26 18:06:39.758 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.234 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.235 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.237 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.237 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.237 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.240 227317 INFO nova.compute.manager [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Terminating instance#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.241 227317 DEBUG nova.compute.manager [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:06:40 np0005596062 kernel: tap36635dd9-db (unregistering): left promiscuous mode
Jan 26 13:06:40 np0005596062 NetworkManager[48993]: <info>  [1769450800.2966] device (tap36635dd9-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:06:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:40Z|00032|binding|INFO|Releasing lport 36635dd9-db93-4788-a953-82e84680a474 from this chassis (sb_readonly=0)
Jan 26 13:06:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:40Z|00033|binding|INFO|Setting lport 36635dd9-db93-4788-a953-82e84680a474 down in Southbound
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.310 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:06:40Z|00034|binding|INFO|Removing iface tap36635dd9-db ovn-installed in OVS
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.314 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.320 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:26:2b 10.1.0.4 fdfe:381f:8400::5'], port_security=['fa:16:3e:32:26:2b 10.1.0.4 fdfe:381f:8400::5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.4/26 fdfe:381f:8400::5/64', 'neutron:device_id': '76fb1ebb-6b94-4c1e-96b6-352821eff2cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0233ae30-2e5a-4e12-9142-37047ec40cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0edb4019e89c4674848ec75122984916', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b8b977b7-e75f-401b-bfd0-7066aad28c16', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4f9fdf8-90b6-44b5-be73-6e7a7109730a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=36635dd9-db93-4788-a953-82e84680a474) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.322 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 36635dd9-db93-4788-a953-82e84680a474 in datapath 0233ae30-2e5a-4e12-9142-37047ec40cce unbound from our chassis#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.323 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0233ae30-2e5a-4e12-9142-37047ec40cce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.325 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a64d384d-de6c-40e4-872f-b56e9efd442c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.325 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce namespace which is not needed anymore#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.334 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 26 13:06:40 np0005596062 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 8.811s CPU time.
Jan 26 13:06:40 np0005596062 systemd-machined[195380]: Machine qemu-1-instance-00000002 terminated.
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.469 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.477 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.484 227317 INFO nova.virt.libvirt.driver [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Instance destroyed successfully.#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.484 227317 DEBUG nova.objects.instance [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lazy-loading 'resources' on Instance uuid 76fb1ebb-6b94-4c1e-96b6-352821eff2cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.504 227317 DEBUG nova.virt.libvirt.vif [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1013927775-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1013927775-1',id=2,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:06:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0edb4019e89c4674848ec75122984916',ramdisk_id='',reservation_id='r-872mjeag',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner
_project_name='tempest-AutoAllocateNetworkTest-1369791216',owner_user_name='tempest-AutoAllocateNetworkTest-1369791216-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:06:32Z,user_data=None,user_id='44d840a696d1433d91d7424baebdfd6b',uuid=76fb1ebb-6b94-4c1e-96b6-352821eff2cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.506 227317 DEBUG nova.network.os_vif_util [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Converting VIF {"id": "36635dd9-db93-4788-a953-82e84680a474", "address": "fa:16:3e:32:26:2b", "network": {"id": "0233ae30-2e5a-4e12-9142-37047ec40cce", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0edb4019e89c4674848ec75122984916", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36635dd9-db", "ovs_interfaceid": "36635dd9-db93-4788-a953-82e84680a474", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.507 227317 DEBUG nova.network.os_vif_util [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.507 227317 DEBUG os_vif [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.509 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.509 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36635dd9-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.511 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.515 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:06:40 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [NOTICE]   (230498) : haproxy version is 2.8.14-c23fe91
Jan 26 13:06:40 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [NOTICE]   (230498) : path to executable is /usr/sbin/haproxy
Jan 26 13:06:40 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [WARNING]  (230498) : Exiting Master process...
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.521 227317 INFO os_vif [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:26:2b,bridge_name='br-int',has_traffic_filtering=True,id=36635dd9-db93-4788-a953-82e84680a474,network=Network(0233ae30-2e5a-4e12-9142-37047ec40cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36635dd9-db')#033[00m
Jan 26 13:06:40 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [ALERT]    (230498) : Current worker (230500) exited with code 143 (Terminated)
Jan 26 13:06:40 np0005596062 neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce[230494]: [WARNING]  (230498) : All workers exited. Exiting... (0)
Jan 26 13:06:40 np0005596062 systemd[1]: libpod-6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472.scope: Deactivated successfully.
Jan 26 13:06:40 np0005596062 podman[230653]: 2026-01-26 18:06:40.53119731 +0000 UTC m=+0.082095012 container died 6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:06:40 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472-userdata-shm.mount: Deactivated successfully.
Jan 26 13:06:40 np0005596062 systemd[1]: var-lib-containers-storage-overlay-cc9358c68d4b6be9632bb19c220c75a990615b1c17383e340267d4480691a175-merged.mount: Deactivated successfully.
Jan 26 13:06:40 np0005596062 podman[230653]: 2026-01-26 18:06:40.577034562 +0000 UTC m=+0.127932284 container cleanup 6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:06:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:40.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.600 227317 DEBUG nova.compute.manager [req-ce42f1f6-d763-4ef2-8e62-42ce2b5c7d3d req-a3bce02d-6d20-4626-8c40-105ab0c2088a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-vif-unplugged-36635dd9-db93-4788-a953-82e84680a474 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:06:40 np0005596062 systemd[1]: libpod-conmon-6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472.scope: Deactivated successfully.
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.602 227317 DEBUG oslo_concurrency.lockutils [req-ce42f1f6-d763-4ef2-8e62-42ce2b5c7d3d req-a3bce02d-6d20-4626-8c40-105ab0c2088a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.603 227317 DEBUG oslo_concurrency.lockutils [req-ce42f1f6-d763-4ef2-8e62-42ce2b5c7d3d req-a3bce02d-6d20-4626-8c40-105ab0c2088a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.603 227317 DEBUG oslo_concurrency.lockutils [req-ce42f1f6-d763-4ef2-8e62-42ce2b5c7d3d req-a3bce02d-6d20-4626-8c40-105ab0c2088a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.604 227317 DEBUG nova.compute.manager [req-ce42f1f6-d763-4ef2-8e62-42ce2b5c7d3d req-a3bce02d-6d20-4626-8c40-105ab0c2088a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] No waiting events found dispatching network-vif-unplugged-36635dd9-db93-4788-a953-82e84680a474 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.604 227317 DEBUG nova.compute.manager [req-ce42f1f6-d763-4ef2-8e62-42ce2b5c7d3d req-a3bce02d-6d20-4626-8c40-105ab0c2088a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-vif-unplugged-36635dd9-db93-4788-a953-82e84680a474 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:06:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:06:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:06:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:06:40 np0005596062 podman[230718]: 2026-01-26 18:06:40.664181626 +0000 UTC m=+0.058050886 container remove 6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.673 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[76d3939e-528a-40c3-8d91-2d32ee2570de]: (4, ('Mon Jan 26 06:06:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce (6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472)\n6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472\nMon Jan 26 06:06:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce (6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472)\n6b97639fa826a8bd208c0c425ae765f83a62549ffda56958e6e0d6f54bf95472\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.675 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4a0a4e-f7c6-458d-b250-2c2a8a5a9a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.677 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0233ae30-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.680 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 kernel: tap0233ae30-20: left promiscuous mode
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.682 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.688 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb62a39-e9ef-412e-a78f-26f967ebe5ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.693 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.705 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a5dc007c-c290-4e76-bdc8-6f51355b3895]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.706 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[66a06614-bab5-4220-b9c4-60343d759c40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.730 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e136ed5a-a2e8-490c-812e-41ada247db85]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455564, 'reachable_time': 33780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230734, 'error': None, 'target': 'ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 systemd[1]: run-netns-ovnmeta\x2d0233ae30\x2d2e5a\x2d4e12\x2d9142\x2d37047ec40cce.mount: Deactivated successfully.
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.746 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0233ae30-2e5a-4e12-9142-37047ec40cce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:06:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:40.748 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2f08a6-2e5f-4e11-a7d3-e848324d59bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.940 227317 INFO nova.virt.libvirt.driver [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Deleting instance files /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc_del#033[00m
Jan 26 13:06:40 np0005596062 nova_compute[227313]: 2026-01-26 18:06:40.941 227317 INFO nova.virt.libvirt.driver [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Deletion of /var/lib/nova/instances/76fb1ebb-6b94-4c1e-96b6-352821eff2cc_del complete#033[00m
Jan 26 13:06:41 np0005596062 nova_compute[227313]: 2026-01-26 18:06:41.000 227317 DEBUG nova.virt.libvirt.host [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Jan 26 13:06:41 np0005596062 nova_compute[227313]: 2026-01-26 18:06:41.000 227317 INFO nova.virt.libvirt.host [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] UEFI support detected#033[00m
Jan 26 13:06:41 np0005596062 nova_compute[227313]: 2026-01-26 18:06:41.003 227317 INFO nova.compute.manager [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:06:41 np0005596062 nova_compute[227313]: 2026-01-26 18:06:41.004 227317 DEBUG oslo.service.loopingcall [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:06:41 np0005596062 nova_compute[227313]: 2026-01-26 18:06:41.004 227317 DEBUG nova.compute.manager [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:06:41 np0005596062 nova_compute[227313]: 2026-01-26 18:06:41.005 227317 DEBUG nova.network.neutron [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:06:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:41.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:42.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.680 227317 DEBUG nova.network.neutron [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.703 227317 INFO nova.compute.manager [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Took 1.70 seconds to deallocate network for instance.#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.726 227317 DEBUG nova.compute.manager [req-76bba44c-518a-4c32-a599-8ef2041a9323 req-fc9536f1-648a-4be7-a348-26e247051c56 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.727 227317 DEBUG oslo_concurrency.lockutils [req-76bba44c-518a-4c32-a599-8ef2041a9323 req-fc9536f1-648a-4be7-a348-26e247051c56 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.727 227317 DEBUG oslo_concurrency.lockutils [req-76bba44c-518a-4c32-a599-8ef2041a9323 req-fc9536f1-648a-4be7-a348-26e247051c56 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.727 227317 DEBUG oslo_concurrency.lockutils [req-76bba44c-518a-4c32-a599-8ef2041a9323 req-fc9536f1-648a-4be7-a348-26e247051c56 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.727 227317 DEBUG nova.compute.manager [req-76bba44c-518a-4c32-a599-8ef2041a9323 req-fc9536f1-648a-4be7-a348-26e247051c56 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] No waiting events found dispatching network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.728 227317 WARNING nova.compute.manager [req-76bba44c-518a-4c32-a599-8ef2041a9323 req-fc9536f1-648a-4be7-a348-26e247051c56 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received unexpected event network-vif-plugged-36635dd9-db93-4788-a953-82e84680a474 for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.774 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.774 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:42 np0005596062 nova_compute[227313]: 2026-01-26 18:06:42.835 227317 DEBUG oslo_concurrency.processutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:06:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:43.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:06:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:06:43 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1930089457' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.324 227317 DEBUG oslo_concurrency.processutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.332 227317 DEBUG nova.compute.provider_tree [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.391 227317 ERROR nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] [req-022e55bf-8cb5-4474-9207-24a61ad7097e] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 65600a65-69bc-488c-8c8c-71cbf43e523a.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-022e55bf-8cb5-4474-9207-24a61ad7097e"}]}#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.425 227317 DEBUG nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.460 227317 DEBUG nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.461 227317 DEBUG nova.compute.provider_tree [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.472 227317 DEBUG nova.compute.manager [req-fd43a240-6ab5-4273-9002-4589f16c9f5c req-7b1e6a77-0ec5-4628-a363-4551de074240 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Received event network-vif-deleted-36635dd9-db93-4788-a953-82e84680a474 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.484 227317 DEBUG nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.509 227317 DEBUG nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.542 227317 DEBUG oslo_concurrency.processutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:06:43 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2703769135' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:06:43 np0005596062 nova_compute[227313]: 2026-01-26 18:06:43.997 227317 DEBUG oslo_concurrency.processutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.004 227317 DEBUG nova.compute.provider_tree [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.084 227317 DEBUG nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updated inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.084 227317 DEBUG nova.compute.provider_tree [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updating resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.085 227317 DEBUG nova.compute.provider_tree [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.125 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.158 227317 INFO nova.scheduler.client.report [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Deleted allocations for instance 76fb1ebb-6b94-4c1e-96b6-352821eff2cc#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.223 227317 DEBUG oslo_concurrency.lockutils [None req-a6a18c9f-dbc8-4af5-ab7d-19766755599b 44d840a696d1433d91d7424baebdfd6b 0edb4019e89c4674848ec75122984916 - - default default] Lock "76fb1ebb-6b94-4c1e-96b6-352821eff2cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.988s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:44 np0005596062 nova_compute[227313]: 2026-01-26 18:06:44.580 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:44.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:45.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:45 np0005596062 nova_compute[227313]: 2026-01-26 18:06:45.512 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:46.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:06:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:47.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:06:48 np0005596062 nova_compute[227313]: 2026-01-26 18:06:48.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:48 np0005596062 nova_compute[227313]: 2026-01-26 18:06:48.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:06:48 np0005596062 nova_compute[227313]: 2026-01-26 18:06:48.072 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:06:48 np0005596062 nova_compute[227313]: 2026-01-26 18:06:48.073 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:48 np0005596062 nova_compute[227313]: 2026-01-26 18:06:48.073 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:06:48 np0005596062 nova_compute[227313]: 2026-01-26 18:06:48.090 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:48.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:49.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.192 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "6c22556a-6e41-4192-be2e-22694f2c2069" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.193 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "6c22556a-6e41-4192-be2e-22694f2c2069" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.213 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.296 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.297 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.306 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.307 227317 INFO nova.compute.claims [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.569 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.581 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:49 np0005596062 nova_compute[227313]: 2026-01-26 18:06:49.708 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.101 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:06:50 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4170640299' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.157 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.166 227317 DEBUG nova.compute.provider_tree [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.184 227317 DEBUG nova.scheduler.client.report [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.222 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.223 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.273 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.274 227317 DEBUG nova.network.neutron [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.295 227317 INFO nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.313 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.394 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.396 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.397 227317 INFO nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Creating image(s)#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.429 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.476 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.507 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.513 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.531 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.588 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.589 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.589 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.590 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:50.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.613 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:50 np0005596062 nova_compute[227313]: 2026-01-26 18:06:50.617 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 6c22556a-6e41-4192-be2e-22694f2c2069_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.015 227317 DEBUG nova.network.neutron [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.016 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:51.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:51 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:51.082 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:06:51 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:06:51.084 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.086 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:06:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.350 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 6c22556a-6e41-4192-be2e-22694f2c2069_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.732s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.438 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] resizing rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.561 227317 DEBUG nova.objects.instance [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lazy-loading 'migration_context' on Instance uuid 6c22556a-6e41-4192-be2e-22694f2c2069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.580 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.581 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Ensure instance console log exists: /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.583 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.583 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.583 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.585 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.590 227317 WARNING nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.596 227317 DEBUG nova.virt.libvirt.host [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.596 227317 DEBUG nova.virt.libvirt.host [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.600 227317 DEBUG nova.virt.libvirt.host [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.600 227317 DEBUG nova.virt.libvirt.host [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.602 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.602 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.602 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.603 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.603 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.603 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.603 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.603 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.603 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.604 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.604 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.604 227317 DEBUG nova.virt.hardware [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:06:51 np0005596062 nova_compute[227313]: 2026-01-26 18:06:51.607 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:06:52 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/269338043' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.094 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.095 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.095 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.095 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.096 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.123 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.156 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.162 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:06:52 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2618085126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.577 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:06:52 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1059260938' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:06:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:06:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:52.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.623 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.625 227317 DEBUG nova.objects.instance [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c22556a-6e41-4192-be2e-22694f2c2069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.640 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <uuid>6c22556a-6e41-4192-be2e-22694f2c2069</uuid>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <name>instance-00000004</name>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:name>tempest-LiveMigrationNegativeTest-server-392763921</nova:name>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:06:51</nova:creationTime>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:user uuid="de05c1206cfc4993b2bcdda77b98b4cb">tempest-LiveMigrationNegativeTest-1527322899-project-member</nova:user>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <nova:project uuid="33e9cd5847344a4bb04467bbf6ff221c">tempest-LiveMigrationNegativeTest-1527322899</nova:project>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <nova:ports/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <entry name="serial">6c22556a-6e41-4192-be2e-22694f2c2069</entry>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <entry name="uuid">6c22556a-6e41-4192-be2e-22694f2c2069</entry>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/6c22556a-6e41-4192-be2e-22694f2c2069_disk">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/6c22556a-6e41-4192-be2e-22694f2c2069_disk.config">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/console.log" append="off"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:06:52 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:06:52 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:06:52 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:06:52 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.694 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.694 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.695 227317 INFO nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Using config drive#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.723 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.828 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.830 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4966MB free_disk=20.94280242919922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.831 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.831 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.943 227317 INFO nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Creating config drive at /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/disk.config#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.948 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hp2ezqi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.977 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 6c22556a-6e41-4192-be2e-22694f2c2069 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.978 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:06:52 np0005596062 nova_compute[227313]: 2026-01-26 18:06:52.978 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:06:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:53.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.085 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.108 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hp2ezqi" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.148 227317 DEBUG nova.storage.rbd_utils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] rbd image 6c22556a-6e41-4192-be2e-22694f2c2069_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.154 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/disk.config 6c22556a-6e41-4192-be2e-22694f2c2069_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:06:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:06:53 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/160237053' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.571 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.576 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.591 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.613 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:06:53 np0005596062 nova_compute[227313]: 2026-01-26 18:06:53.613 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:53 np0005596062 podman[231238]: 2026-01-26 18:06:53.893542166 +0000 UTC m=+0.092207250 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.583 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.612 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.613 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.613 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:06:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:54.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.632 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.632 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.633 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.634 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.922 227317 DEBUG oslo_concurrency.processutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/disk.config 6c22556a-6e41-4192-be2e-22694f2c2069_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.768s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:06:54 np0005596062 nova_compute[227313]: 2026-01-26 18:06:54.922 227317 INFO nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Deleting local config drive /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069/disk.config because it was imported into RBD.#033[00m
Jan 26 13:06:55 np0005596062 systemd-machined[195380]: New machine qemu-2-instance-00000004.
Jan 26 13:06:55 np0005596062 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 26 13:06:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:55.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.482 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769450800.4810796, 76fb1ebb-6b94-4c1e-96b6-352821eff2cc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.485 227317 INFO nova.compute.manager [-] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.506 227317 DEBUG nova.compute.manager [None req-2dac169b-cf23-41d7-a000-b5ac3a8d3701 - - - - - -] [instance: 76fb1ebb-6b94-4c1e-96b6-352821eff2cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.533 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.621 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450815.6205747, 6c22556a-6e41-4192-be2e-22694f2c2069 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.622 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.625 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.625 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.629 227317 INFO nova.virt.libvirt.driver [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Instance spawned successfully.#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.630 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.661 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.667 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.667 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.668 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.668 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.669 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.669 227317 DEBUG nova.virt.libvirt.driver [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.674 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.734 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.735 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450815.6249266, 6c22556a-6e41-4192-be2e-22694f2c2069 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.735 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] VM Started (Lifecycle Event)#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.741 227317 INFO nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Took 5.35 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.742 227317 DEBUG nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.764 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.768 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.794 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.810 227317 INFO nova.compute.manager [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Took 6.54 seconds to build instance.#033[00m
Jan 26 13:06:55 np0005596062 nova_compute[227313]: 2026-01-26 18:06:55.825 227317 DEBUG oslo_concurrency.lockutils [None req-6275dc89-7a56-467a-a1b9-3ab019ae3ca3 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "6c22556a-6e41-4192-be2e-22694f2c2069" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.517 227317 DEBUG nova.objects.instance [None req-3ad4d93a-0a62-4c6d-9a63-a076eddb4d88 422accd613a64b818c0200e00879ba23 a6ae56dd7fd94819a9f44ec6b9ae6696 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c22556a-6e41-4192-be2e-22694f2c2069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.541 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450816.540496, 6c22556a-6e41-4192-be2e-22694f2c2069 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.541 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.561 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.566 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.584 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 26 13:06:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:56.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:56 np0005596062 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 26 13:06:56 np0005596062 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 1.587s CPU time.
Jan 26 13:06:56 np0005596062 systemd-machined[195380]: Machine qemu-2-instance-00000004 terminated.
Jan 26 13:06:56 np0005596062 nova_compute[227313]: 2026-01-26 18:06:56.863 227317 DEBUG nova.compute.manager [None req-3ad4d93a-0a62-4c6d-9a63-a076eddb4d88 422accd613a64b818c0200e00879ba23 a6ae56dd7fd94819a9f44ec6b9ae6696 - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:06:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:06:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:57.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:06:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:06:58.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:06:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:06:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:06:59.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.201 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "6c22556a-6e41-4192-be2e-22694f2c2069" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.202 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "6c22556a-6e41-4192-be2e-22694f2c2069" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.202 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "6c22556a-6e41-4192-be2e-22694f2c2069-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.203 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "6c22556a-6e41-4192-be2e-22694f2c2069-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.203 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "6c22556a-6e41-4192-be2e-22694f2c2069-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.204 227317 INFO nova.compute.manager [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Terminating instance#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.205 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "refresh_cache-6c22556a-6e41-4192-be2e-22694f2c2069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.205 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquired lock "refresh_cache-6c22556a-6e41-4192-be2e-22694f2c2069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.205 227317 DEBUG nova.network.neutron [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.586 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:06:59 np0005596062 nova_compute[227313]: 2026-01-26 18:06:59.963 227317 DEBUG nova.network.neutron [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:07:00 np0005596062 nova_compute[227313]: 2026-01-26 18:07:00.332 227317 DEBUG nova.network.neutron [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:07:00 np0005596062 nova_compute[227313]: 2026-01-26 18:07:00.348 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Releasing lock "refresh_cache-6c22556a-6e41-4192-be2e-22694f2c2069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:07:00 np0005596062 nova_compute[227313]: 2026-01-26 18:07:00.349 227317 DEBUG nova.compute.manager [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:07:00 np0005596062 nova_compute[227313]: 2026-01-26 18:07:00.358 227317 INFO nova.virt.libvirt.driver [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Instance destroyed successfully.#033[00m
Jan 26 13:07:00 np0005596062 nova_compute[227313]: 2026-01-26 18:07:00.359 227317 DEBUG nova.objects.instance [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lazy-loading 'resources' on Instance uuid 6c22556a-6e41-4192-be2e-22694f2c2069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:07:00 np0005596062 nova_compute[227313]: 2026-01-26 18:07:00.562 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:07:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:00.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:07:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:01.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:01 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:01.086 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.601 227317 INFO nova.virt.libvirt.driver [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Deleting instance files /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069_del#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.602 227317 INFO nova.virt.libvirt.driver [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Deletion of /var/lib/nova/instances/6c22556a-6e41-4192-be2e-22694f2c2069_del complete#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.686 227317 INFO nova.compute.manager [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Took 1.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.688 227317 DEBUG oslo.service.loopingcall [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.688 227317 DEBUG nova.compute.manager [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.689 227317 DEBUG nova.network.neutron [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.964 227317 DEBUG nova.network.neutron [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:07:01 np0005596062 nova_compute[227313]: 2026-01-26 18:07:01.988 227317 DEBUG nova.network.neutron [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.015 227317 INFO nova.compute.manager [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Took 0.33 seconds to deallocate network for instance.#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.063 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.064 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.113 227317 DEBUG oslo_concurrency.processutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:07:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2730867059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.618 227317 DEBUG oslo_concurrency.processutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.625 227317 DEBUG nova.compute.provider_tree [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:07:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:02.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.644 227317 DEBUG nova.scheduler.client.report [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.669 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.693 227317 INFO nova.scheduler.client.report [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Deleted allocations for instance 6c22556a-6e41-4192-be2e-22694f2c2069#033[00m
Jan 26 13:07:02 np0005596062 nova_compute[227313]: 2026-01-26 18:07:02.787 227317 DEBUG oslo_concurrency.lockutils [None req-1df74d19-61ae-4e18-89c2-a7d957a7c884 de05c1206cfc4993b2bcdda77b98b4cb 33e9cd5847344a4bb04467bbf6ff221c - - default default] Lock "6c22556a-6e41-4192-be2e-22694f2c2069" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:03.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:03 np0005596062 podman[231370]: 2026-01-26 18:07:03.899419904 +0000 UTC m=+0.108456619 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:07:04 np0005596062 nova_compute[227313]: 2026-01-26 18:07:04.589 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:04.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:05.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:05 np0005596062 nova_compute[227313]: 2026-01-26 18:07:05.564 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:06.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:08.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:09.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:09.155 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:09.156 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:09.156 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:09 np0005596062 nova_compute[227313]: 2026-01-26 18:07:09.591 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:10 np0005596062 nova_compute[227313]: 2026-01-26 18:07:10.566 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:10.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:11.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:11 np0005596062 nova_compute[227313]: 2026-01-26 18:07:11.865 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769450816.8633292, 6c22556a-6e41-4192-be2e-22694f2c2069 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:07:11 np0005596062 nova_compute[227313]: 2026-01-26 18:07:11.866 227317 INFO nova.compute.manager [-] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:07:12 np0005596062 nova_compute[227313]: 2026-01-26 18:07:12.322 227317 DEBUG nova.compute.manager [None req-a0b825ea-fff5-4c1c-9387-9d122972b713 - - - - - -] [instance: 6c22556a-6e41-4192-be2e-22694f2c2069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:07:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:12.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:13.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:14 np0005596062 nova_compute[227313]: 2026-01-26 18:07:14.593 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:14.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:15.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:15 np0005596062 nova_compute[227313]: 2026-01-26 18:07:15.568 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:16.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:17.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:07:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:18.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:07:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:19.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:19 np0005596062 nova_compute[227313]: 2026-01-26 18:07:19.595 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:20 np0005596062 nova_compute[227313]: 2026-01-26 18:07:20.570 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:20.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:21.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:22.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:23.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:23 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:23Z|00035|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 13:07:24 np0005596062 nova_compute[227313]: 2026-01-26 18:07:24.598 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:24.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:24 np0005596062 podman[231457]: 2026-01-26 18:07:24.904230433 +0000 UTC m=+0.101636180 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:07:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:25.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.173 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.174 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.196 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.322 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.323 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.330 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.331 227317 INFO nova.compute.claims [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.504 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:25 np0005596062 nova_compute[227313]: 2026-01-26 18:07:25.572 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:07:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4119684355' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.009 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.017 227317 DEBUG nova.compute.provider_tree [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.034 227317 DEBUG nova.scheduler.client.report [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.065 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.066 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.174 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.175 227317 DEBUG nova.network.neutron [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.215 227317 INFO nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.252 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.388 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.390 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.391 227317 INFO nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Creating image(s)#033[00m
Jan 26 13:07:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.508 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.546 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.580 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.584 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.639 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.640 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.641 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.641 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:07:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:26.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.675 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.680 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:26 np0005596062 nova_compute[227313]: 2026-01-26 18:07:26.989 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.310s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.094 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] resizing rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:07:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:27.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.232 227317 DEBUG nova.objects.instance [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lazy-loading 'migration_context' on Instance uuid e40120ae-eb4e-4f0b-9d8f-f0210de78c4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.249 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.249 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Ensure instance console log exists: /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.251 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.251 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.251 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:27 np0005596062 nova_compute[227313]: 2026-01-26 18:07:27.491 227317 DEBUG nova.policy [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e3f505042e7463683259f02e8e59eca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1f2cad350784d7eae39fc23fb032500', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:07:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:07:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:28.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:07:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:29.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:29 np0005596062 nova_compute[227313]: 2026-01-26 18:07:29.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:30 np0005596062 nova_compute[227313]: 2026-01-26 18:07:30.576 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:30.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.042 227317 DEBUG nova.network.neutron [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Successfully updated port: 06538465-e309-4216-af1a-244565d3805b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.082 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.083 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquired lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.083 227317 DEBUG nova.network.neutron [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:07:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:31.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.395 227317 DEBUG nova.compute.manager [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-changed-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.395 227317 DEBUG nova.compute.manager [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Refreshing instance network info cache due to event network-changed-06538465-e309-4216-af1a-244565d3805b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.396 227317 DEBUG oslo_concurrency.lockutils [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:07:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:31 np0005596062 nova_compute[227313]: 2026-01-26 18:07:31.508 227317 DEBUG nova.network.neutron [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:07:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:32.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.949 227317 DEBUG nova.network.neutron [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updating instance_info_cache with network_info: [{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.975 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Releasing lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.975 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Instance network_info: |[{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.976 227317 DEBUG oslo_concurrency.lockutils [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.976 227317 DEBUG nova.network.neutron [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Refreshing network info cache for port 06538465-e309-4216-af1a-244565d3805b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.978 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Start _get_guest_xml network_info=[{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.983 227317 WARNING nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.987 227317 DEBUG nova.virt.libvirt.host [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.988 227317 DEBUG nova.virt.libvirt.host [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.990 227317 DEBUG nova.virt.libvirt.host [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.991 227317 DEBUG nova.virt.libvirt.host [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.992 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.992 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.993 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.993 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.993 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.993 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.993 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.994 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.994 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.994 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.994 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.995 227317 DEBUG nova.virt.hardware [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:07:32 np0005596062 nova_compute[227313]: 2026-01-26 18:07:32.997 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:33.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:07:33 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3174275240' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:07:33 np0005596062 nova_compute[227313]: 2026-01-26 18:07:33.472 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:33 np0005596062 nova_compute[227313]: 2026-01-26 18:07:33.513 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:33 np0005596062 nova_compute[227313]: 2026-01-26 18:07:33.517 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:07:34 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3233351083' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.037 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.040 227317 DEBUG nova.virt.libvirt.vif [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1296850176',display_name='tempest-LiveMigrationTest-server-1296850176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1296850176',id=5,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-02y9chrd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-m
ember'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:07:26Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=e40120ae-eb4e-4f0b-9d8f-f0210de78c4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.041 227317 DEBUG nova.network.os_vif_util [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converting VIF {"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.043 227317 DEBUG nova.network.os_vif_util [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.045 227317 DEBUG nova.objects.instance [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lazy-loading 'pci_devices' on Instance uuid e40120ae-eb4e-4f0b-9d8f-f0210de78c4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.080 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <uuid>e40120ae-eb4e-4f0b-9d8f-f0210de78c4f</uuid>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <name>instance-00000005</name>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:name>tempest-LiveMigrationTest-server-1296850176</nova:name>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:07:32</nova:creationTime>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:user uuid="9e3f505042e7463683259f02e8e59eca">tempest-LiveMigrationTest-877386369-project-member</nova:user>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:project uuid="b1f2cad350784d7eae39fc23fb032500">tempest-LiveMigrationTest-877386369</nova:project>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <nova:port uuid="06538465-e309-4216-af1a-244565d3805b">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <entry name="serial">e40120ae-eb4e-4f0b-9d8f-f0210de78c4f</entry>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <entry name="uuid">e40120ae-eb4e-4f0b-9d8f-f0210de78c4f</entry>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk.config">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:35:48:ae"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <target dev="tap06538465-e3"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/console.log" append="off"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:07:34 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:07:34 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:07:34 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:07:34 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.082 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Preparing to wait for external event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.082 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.083 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.083 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.085 227317 DEBUG nova.virt.libvirt.vif [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1296850176',display_name='tempest-LiveMigrationTest-server-1296850176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1296850176',id=5,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-02y9chrd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:07:26Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=e40120ae-eb4e-4f0b-9d8f-f0210de78c4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.085 227317 DEBUG nova.network.os_vif_util [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converting VIF {"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.086 227317 DEBUG nova.network.os_vif_util [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.087 227317 DEBUG os_vif [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.089 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.089 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.090 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.095 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.095 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06538465-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.096 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06538465-e3, col_values=(('external_ids', {'iface-id': '06538465-e309-4216-af1a-244565d3805b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:48:ae', 'vm-uuid': 'e40120ae-eb4e-4f0b-9d8f-f0210de78c4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:34 np0005596062 NetworkManager[48993]: <info>  [1769450854.1006] manager: (tap06538465-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.098 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.103 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.110 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.112 227317 INFO os_vif [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3')#033[00m
Jan 26 13:07:34 np0005596062 podman[231785]: 2026-01-26 18:07:34.280015012 +0000 UTC m=+0.119678770 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.339 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.340 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.340 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] No VIF found with MAC fa:16:3e:35:48:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.341 227317 INFO nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Using config drive#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.378 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:34 np0005596062 nova_compute[227313]: 2026-01-26 18:07:34.602 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:34.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.014 227317 INFO nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Creating config drive at /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/disk.config#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.023 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp888pclmv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:35.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.166 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp888pclmv" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.210 227317 DEBUG nova.storage.rbd_utils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.216 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/disk.config e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.340 227317 DEBUG nova.network.neutron [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updated VIF entry in instance network info cache for port 06538465-e309-4216-af1a-244565d3805b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.342 227317 DEBUG nova.network.neutron [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updating instance_info_cache with network_info: [{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.368 227317 DEBUG oslo_concurrency.lockutils [req-be6e2cd1-8f10-477f-b083-d29d6d9240d2 req-059b7254-58c6-44f1-ba14-be25f829fee3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.419 227317 DEBUG oslo_concurrency.processutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/disk.config e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.420 227317 INFO nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Deleting local config drive /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f/disk.config because it was imported into RBD.#033[00m
Jan 26 13:07:35 np0005596062 kernel: tap06538465-e3: entered promiscuous mode
Jan 26 13:07:35 np0005596062 NetworkManager[48993]: <info>  [1769450855.4890] manager: (tap06538465-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.489 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00036|binding|INFO|Claiming lport 06538465-e309-4216-af1a-244565d3805b for this chassis.
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00037|binding|INFO|06538465-e309-4216-af1a-244565d3805b: Claiming fa:16:3e:35:48:ae 10.100.0.14
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00038|binding|INFO|Claiming lport 8efebc34-f8eb-42e5-af94-78e84c0dcbba for this chassis.
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00039|binding|INFO|8efebc34-f8eb-42e5-af94-78e84c0dcbba: Claiming fa:16:3e:c6:69:fa 19.80.0.72
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.493 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.502 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.521 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:48:ae 10.100.0.14'], port_security=['fa:16:3e:35:48:ae 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1321931442', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e40120ae-eb4e-4f0b-9d8f-f0210de78c4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1321931442', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=06538465-e309-4216-af1a-244565d3805b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.523 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:69:fa 19.80.0.72'], port_security=['fa:16:3e:c6:69:fa 19.80.0.72'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['06538465-e309-4216-af1a-244565d3805b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2075617635', 'neutron:cidrs': '19.80.0.72/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2075617635', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=75dd0954-cbf3-4a3e-a6ef-19fcd101cc5d, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8efebc34-f8eb-42e5-af94-78e84c0dcbba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.524 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 06538465-e309-4216-af1a-244565d3805b in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b bound to our chassis#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.526 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0516cc55-93b8-4bf2-b595-d07702fa255b#033[00m
Jan 26 13:07:35 np0005596062 systemd-machined[195380]: New machine qemu-3-instance-00000005.
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.543 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[83b7d571-b080-40fb-aae5-3c769c9932c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.544 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0516cc55-91 in ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.546 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0516cc55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.546 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac7611f-972f-4ab0-bc7e-b4c73027a496]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.547 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ab344a04-ab3a-4770-8b7d-faff1460d360]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.565 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[70791a90-3d36-41dc-bfb9-60ca142719e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.595 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0de7d2-0e06-41f0-aa35-22a048e4b00b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.632 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00040|binding|INFO|Setting lport 06538465-e309-4216-af1a-244565d3805b ovn-installed in OVS
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00041|binding|INFO|Setting lport 06538465-e309-4216-af1a-244565d3805b up in Southbound
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00042|binding|INFO|Setting lport 8efebc34-f8eb-42e5-af94-78e84c0dcbba up in Southbound
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.639 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.642 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[493daaa7-6d3a-445f-bea4-97d87513f0c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.647 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9b77c2-66f2-416d-ba65-b7b9ed6a01d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 NetworkManager[48993]: <info>  [1769450855.6494] manager: (tap0516cc55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 26 13:07:35 np0005596062 systemd-udevd[231889]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:07:35 np0005596062 systemd-udevd[231890]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:07:35 np0005596062 NetworkManager[48993]: <info>  [1769450855.6828] device (tap06538465-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:07:35 np0005596062 NetworkManager[48993]: <info>  [1769450855.6839] device (tap06538465-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.687 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[95a19559-1450-44f1-8509-58ee4023be04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.690 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f681b23b-a7bc-413a-8129-b0fc1d027e4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 NetworkManager[48993]: <info>  [1769450855.7153] device (tap0516cc55-90): carrier: link connected
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.718 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[99b04c7f-3245-48d1-a02a-a6c9eb3ea7b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.737 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[57316553-4764-42d6-957e-2f5e6e577ee9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0516cc55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:40:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461576, 'reachable_time': 28180, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231916, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.756 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[07833d03-0cba-4b26-8dbf-40ba45ddec64]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:40ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461576, 'tstamp': 461576}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231917, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.777 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[70bc162c-9d7e-4fac-a928-29df44cbd5a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0516cc55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:40:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461576, 'reachable_time': 28180, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231918, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.856 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7595641a-1366-4972-90f6-a14689f4b8e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.925 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[202d01bc-7c53-4acc-a695-dcc7f75cae78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.927 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0516cc55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.927 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.928 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0516cc55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 NetworkManager[48993]: <info>  [1769450855.9305] manager: (tap0516cc55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 26 13:07:35 np0005596062 kernel: tap0516cc55-90: entered promiscuous mode
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.933 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0516cc55-90, col_values=(('external_ids', {'iface-id': '46cfbba6-430a-495c-9d6a-60cf58c877d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.934 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:35Z|00043|binding|INFO|Releasing lport 46cfbba6-430a-495c-9d6a-60cf58c877d3 from this chassis (sb_readonly=0)
Jan 26 13:07:35 np0005596062 nova_compute[227313]: 2026-01-26 18:07:35.954 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.955 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.956 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2999f001-eaa2-47d1-a554-aa6c5db36922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.956 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-0516cc55-93b8-4bf2-b595-d07702fa255b
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 0516cc55-93b8-4bf2-b595-d07702fa255b
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:07:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:35.957 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'env', 'PROCESS_TAG=haproxy-0516cc55-93b8-4bf2-b595-d07702fa255b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0516cc55-93b8-4bf2-b595-d07702fa255b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.083 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450856.0828564, e40120ae-eb4e-4f0b-9d8f-f0210de78c4f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.084 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] VM Started (Lifecycle Event)#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.123 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.137 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450856.0829713, e40120ae-eb4e-4f0b-9d8f-f0210de78c4f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.137 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.163 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.167 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.196 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.243 227317 DEBUG nova.compute.manager [req-9a5530dd-76d9-4ae7-b9d5-938dc0e171cb req-38709f37-1937-4228-872c-c35da387a132 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.244 227317 DEBUG oslo_concurrency.lockutils [req-9a5530dd-76d9-4ae7-b9d5-938dc0e171cb req-38709f37-1937-4228-872c-c35da387a132 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.245 227317 DEBUG oslo_concurrency.lockutils [req-9a5530dd-76d9-4ae7-b9d5-938dc0e171cb req-38709f37-1937-4228-872c-c35da387a132 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.245 227317 DEBUG oslo_concurrency.lockutils [req-9a5530dd-76d9-4ae7-b9d5-938dc0e171cb req-38709f37-1937-4228-872c-c35da387a132 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.246 227317 DEBUG nova.compute.manager [req-9a5530dd-76d9-4ae7-b9d5-938dc0e171cb req-38709f37-1937-4228-872c-c35da387a132 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Processing event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.247 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.250 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450856.2503598, e40120ae-eb4e-4f0b-9d8f-f0210de78c4f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.251 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.253 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.259 227317 INFO nova.virt.libvirt.driver [-] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Instance spawned successfully.#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.261 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.287 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.297 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.301 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.302 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.303 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.304 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.304 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.306 227317 DEBUG nova.virt.libvirt.driver [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.330 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.404 227317 INFO nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Took 10.01 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.405 227317 DEBUG nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:07:36 np0005596062 podman[231994]: 2026-01-26 18:07:36.437820453 +0000 UTC m=+0.081302121 container create 48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 13:07:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:36 np0005596062 systemd[1]: Started libpod-conmon-48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60.scope.
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.487 227317 INFO nova.compute.manager [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Took 11.21 seconds to build instance.#033[00m
Jan 26 13:07:36 np0005596062 podman[231994]: 2026-01-26 18:07:36.396571087 +0000 UTC m=+0.040052785 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:07:36 np0005596062 nova_compute[227313]: 2026-01-26 18:07:36.502 227317 DEBUG oslo_concurrency.lockutils [None req-69ec9256-5e63-4018-819e-cc0ba2d0fc1c 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:36 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:07:36 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41f07daad5ceef7449ac1fb8869b5ac8dbc03632619eacf02dd7c7f63b5cf736/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:07:36 np0005596062 podman[231994]: 2026-01-26 18:07:36.536283128 +0000 UTC m=+0.179764776 container init 48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:07:36 np0005596062 podman[231994]: 2026-01-26 18:07:36.54613323 +0000 UTC m=+0.189614848 container start 48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:07:36 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [NOTICE]   (232013) : New worker (232015) forked
Jan 26 13:07:36 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [NOTICE]   (232013) : Loading success.
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.606 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 8efebc34-f8eb-42e5-af94-78e84c0dcbba in datapath ebb9e0b4-8385-462a-84cc-87c6f72c0c65 unbound from our chassis#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.611 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ebb9e0b4-8385-462a-84cc-87c6f72c0c65#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.623 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[89288fc0-2302-4248-be8e-70a717317cec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.625 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapebb9e0b4-81 in ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.627 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapebb9e0b4-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.627 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e91ec4-38b7-465f-b907-5a467b635645]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.629 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bc074d01-c5fb-4407-9651-bdce703044fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.643 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[31a876c9-a341-4d74-89d9-7b74dbba09f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.666 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4eda0a43-b491-48c6-89c9-bb8f9aadd894]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:36.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.702 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[e6754935-168c-4ad2-b5fb-fb3a089549dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.710 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[78610b00-4361-4e85-8504-10cf4f9b0930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 NetworkManager[48993]: <info>  [1769450856.7111] manager: (tapebb9e0b4-80): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.742 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[45b88c83-32e3-4f27-9753-e51f976240f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.752 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b4f206-be71-4e69-a3ac-c8091c0927b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 NetworkManager[48993]: <info>  [1769450856.7773] device (tapebb9e0b4-80): carrier: link connected
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.785 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[64a6bb8e-93c1-4988-b9d1-e4e5f4d16e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.811 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c4fc0eb6-8ac9-4149-a937-5c8ece45d7e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapebb9e0b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:af:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461682, 'reachable_time': 19784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232034, 'error': None, 'target': 'ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.840 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6a7250-3a62-45fc-aa29-a56657638a10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:af9c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461682, 'tstamp': 461682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232035, 'error': None, 'target': 'ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.872 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ab74b318-52e6-43fa-ae2f-b441633160d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapebb9e0b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:af:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461682, 'reachable_time': 19784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232036, 'error': None, 'target': 'ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.918 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5efe0558-19d4-4428-b89e-ace4073b91a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:36.999 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdffa5c-18a0-4938-a1b2-892f73afac30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.001 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebb9e0b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.002 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.003 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebb9e0b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:37 np0005596062 NetworkManager[48993]: <info>  [1769450857.0071] manager: (tapebb9e0b4-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 26 13:07:37 np0005596062 kernel: tapebb9e0b4-80: entered promiscuous mode
Jan 26 13:07:37 np0005596062 nova_compute[227313]: 2026-01-26 18:07:37.009 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.018 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapebb9e0b4-80, col_values=(('external_ids', {'iface-id': 'ec5ab65e-333c-4443-bd37-b74fa484479e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:37 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:37Z|00044|binding|INFO|Releasing lport ec5ab65e-333c-4443-bd37-b74fa484479e from this chassis (sb_readonly=0)
Jan 26 13:07:37 np0005596062 nova_compute[227313]: 2026-01-26 18:07:37.021 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:37 np0005596062 nova_compute[227313]: 2026-01-26 18:07:37.050 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.053 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ebb9e0b4-8385-462a-84cc-87c6f72c0c65.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ebb9e0b4-8385-462a-84cc-87c6f72c0c65.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.054 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[16b063c9-f50d-4926-a72d-49a921706207]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.055 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-ebb9e0b4-8385-462a-84cc-87c6f72c0c65
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/ebb9e0b4-8385-462a-84cc-87c6f72c0c65.pid.haproxy
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID ebb9e0b4-8385-462a-84cc-87c6f72c0c65
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:07:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:37.056 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'env', 'PROCESS_TAG=haproxy-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ebb9e0b4-8385-462a-84cc-87c6f72c0c65.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:07:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:37 np0005596062 podman[232069]: 2026-01-26 18:07:37.495830093 +0000 UTC m=+0.058562036 container create 4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 13:07:37 np0005596062 systemd[1]: Started libpod-conmon-4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882.scope.
Jan 26 13:07:37 np0005596062 podman[232069]: 2026-01-26 18:07:37.462236331 +0000 UTC m=+0.024968344 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:07:37 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:07:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28ea82d1afebcdb69351f1bb7046eeda26c54ae63dd41906ceea6a3e11b346d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:07:37 np0005596062 podman[232069]: 2026-01-26 18:07:37.597280858 +0000 UTC m=+0.160012871 container init 4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 13:07:37 np0005596062 podman[232069]: 2026-01-26 18:07:37.604710615 +0000 UTC m=+0.167442588 container start 4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:07:37 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [NOTICE]   (232089) : New worker (232091) forked
Jan 26 13:07:37 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [NOTICE]   (232089) : Loading success.
Jan 26 13:07:38 np0005596062 nova_compute[227313]: 2026-01-26 18:07:38.480 227317 DEBUG nova.compute.manager [req-7441dcca-a06d-43de-a34e-eeab09e85fc0 req-31b56bac-ed00-4ff9-9bdc-7b8f9c7c9f99 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:38 np0005596062 nova_compute[227313]: 2026-01-26 18:07:38.481 227317 DEBUG oslo_concurrency.lockutils [req-7441dcca-a06d-43de-a34e-eeab09e85fc0 req-31b56bac-ed00-4ff9-9bdc-7b8f9c7c9f99 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:38 np0005596062 nova_compute[227313]: 2026-01-26 18:07:38.482 227317 DEBUG oslo_concurrency.lockutils [req-7441dcca-a06d-43de-a34e-eeab09e85fc0 req-31b56bac-ed00-4ff9-9bdc-7b8f9c7c9f99 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:38 np0005596062 nova_compute[227313]: 2026-01-26 18:07:38.482 227317 DEBUG oslo_concurrency.lockutils [req-7441dcca-a06d-43de-a34e-eeab09e85fc0 req-31b56bac-ed00-4ff9-9bdc-7b8f9c7c9f99 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:38 np0005596062 nova_compute[227313]: 2026-01-26 18:07:38.482 227317 DEBUG nova.compute.manager [req-7441dcca-a06d-43de-a34e-eeab09e85fc0 req-31b56bac-ed00-4ff9-9bdc-7b8f9c7c9f99 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-plugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:07:38 np0005596062 nova_compute[227313]: 2026-01-26 18:07:38.483 227317 WARNING nova.compute.manager [req-7441dcca-a06d-43de-a34e-eeab09e85fc0 req-31b56bac-ed00-4ff9-9bdc-7b8f9c7c9f99 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received unexpected event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b for instance with vm_state active and task_state None.#033[00m
Jan 26 13:07:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:38.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:39 np0005596062 nova_compute[227313]: 2026-01-26 18:07:39.100 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:39.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:39 np0005596062 nova_compute[227313]: 2026-01-26 18:07:39.606 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:40.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:41.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:42.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:43.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:44 np0005596062 nova_compute[227313]: 2026-01-26 18:07:44.103 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:44 np0005596062 nova_compute[227313]: 2026-01-26 18:07:44.323 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Check if temp file /var/lib/nova/instances/tmpdzsz1mp9 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 26 13:07:44 np0005596062 nova_compute[227313]: 2026-01-26 18:07:44.324 227317 DEBUG nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdzsz1mp9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e40120ae-eb4e-4f0b-9d8f-f0210de78c4f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 26 13:07:44 np0005596062 nova_compute[227313]: 2026-01-26 18:07:44.606 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:45.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:46.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:47.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:47 np0005596062 nova_compute[227313]: 2026-01-26 18:07:47.347 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:07:47 np0005596062 nova_compute[227313]: 2026-01-26 18:07:47.348 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:07:47 np0005596062 nova_compute[227313]: 2026-01-26 18:07:47.357 227317 INFO nova.compute.rpcapi [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Jan 26 13:07:47 np0005596062 nova_compute[227313]: 2026-01-26 18:07:47.359 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:07:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:48.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:49 np0005596062 nova_compute[227313]: 2026-01-26 18:07:49.108 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:49.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:49 np0005596062 nova_compute[227313]: 2026-01-26 18:07:49.609 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:07:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:50.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:07:51 np0005596062 nova_compute[227313]: 2026-01-26 18:07:51.067 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:51.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:51 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:51.245 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:07:51 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:51.247 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:07:51 np0005596062 nova_compute[227313]: 2026-01-26 18:07:51.286 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:52 np0005596062 nova_compute[227313]: 2026-01-26 18:07:52.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:52 np0005596062 nova_compute[227313]: 2026-01-26 18:07:52.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:52 np0005596062 nova_compute[227313]: 2026-01-26 18:07:52.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:52 np0005596062 nova_compute[227313]: 2026-01-26 18:07:52.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:07:52 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:52.249 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:52 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:52Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:48:ae 10.100.0.14
Jan 26 13:07:52 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:52Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:48:ae 10.100.0.14
Jan 26 13:07:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:07:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:52.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:07:53 np0005596062 nova_compute[227313]: 2026-01-26 18:07:53.047 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:53.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.106 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.107 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.107 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.107 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.108 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.126 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.554 227317 DEBUG nova.compute.manager [req-896a83f3-e3d7-4858-a6db-d7e581413b43 req-15b80c10-faa5-4878-9b29-d96a7c37735a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-unplugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.555 227317 DEBUG oslo_concurrency.lockutils [req-896a83f3-e3d7-4858-a6db-d7e581413b43 req-15b80c10-faa5-4878-9b29-d96a7c37735a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.555 227317 DEBUG oslo_concurrency.lockutils [req-896a83f3-e3d7-4858-a6db-d7e581413b43 req-15b80c10-faa5-4878-9b29-d96a7c37735a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.555 227317 DEBUG oslo_concurrency.lockutils [req-896a83f3-e3d7-4858-a6db-d7e581413b43 req-15b80c10-faa5-4878-9b29-d96a7c37735a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.555 227317 DEBUG nova.compute.manager [req-896a83f3-e3d7-4858-a6db-d7e581413b43 req-15b80c10-faa5-4878-9b29-d96a7c37735a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-unplugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.556 227317 DEBUG nova.compute.manager [req-896a83f3-e3d7-4858-a6db-d7e581413b43 req-15b80c10-faa5-4878-9b29-d96a7c37735a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-unplugged-06538465-e309-4216-af1a-244565d3805b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.610 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:07:54 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/327403103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.661 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:54.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.878 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:07:54 np0005596062 nova_compute[227313]: 2026-01-26 18:07:54.878 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000005 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.082 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.083 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4790MB free_disk=20.941871643066406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.083 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.083 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:07:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:55.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.180 227317 INFO nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updating resource usage from migration 1b877e7a-f025-4e3a-b89d-0d8bb1ffb592#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.247 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Migration 1b877e7a-f025-4e3a-b89d-0d8bb1ffb592 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.248 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.248 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.394 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.445388) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450875445926, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1191, "num_deletes": 251, "total_data_size": 2477527, "memory_usage": 2500672, "flush_reason": "Manual Compaction"}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450875461316, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1622611, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21805, "largest_seqno": 22990, "table_properties": {"data_size": 1617509, "index_size": 2562, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11582, "raw_average_key_size": 19, "raw_value_size": 1606949, "raw_average_value_size": 2765, "num_data_blocks": 115, "num_entries": 581, "num_filter_entries": 581, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450783, "oldest_key_time": 1769450783, "file_creation_time": 1769450875, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 15973 microseconds, and 6668 cpu microseconds.
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.461365) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1622611 bytes OK
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.461388) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.463908) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.463963) EVENT_LOG_v1 {"time_micros": 1769450875463934, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.463991) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 2471819, prev total WAL file size 2471819, number of live WAL files 2.
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.465641) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1584KB)], [42(8078KB)]
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450875465732, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 9895411, "oldest_snapshot_seqno": -1}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 4761 keys, 7858520 bytes, temperature: kUnknown
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450875513013, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 7858520, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7827070, "index_size": 18476, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11909, "raw_key_size": 119368, "raw_average_key_size": 25, "raw_value_size": 7741097, "raw_average_value_size": 1625, "num_data_blocks": 756, "num_entries": 4761, "num_filter_entries": 4761, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450875, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.513628) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 7858520 bytes
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.516295) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 207.8 rd, 165.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 7.9 +0.0 blob) out(7.5 +0.0 blob), read-write-amplify(10.9) write-amplify(4.8) OK, records in: 5281, records dropped: 520 output_compression: NoCompression
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.516318) EVENT_LOG_v1 {"time_micros": 1769450875516308, "job": 24, "event": "compaction_finished", "compaction_time_micros": 47631, "compaction_time_cpu_micros": 17840, "output_level": 6, "num_output_files": 1, "total_output_size": 7858520, "num_input_records": 5281, "num_output_records": 4761, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450875516765, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450875518638, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.465500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.518774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.518781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.518784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.518786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:07:55.518788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.663 227317 INFO nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Took 8.32 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.670 227317 DEBUG nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.721 227317 DEBUG nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdzsz1mp9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e40120ae-eb4e-4f0b-9d8f-f0210de78c4f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1b877e7a-f025-4e3a-b89d-0d8bb1ffb592),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.727 227317 DEBUG nova.objects.instance [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lazy-loading 'migration_context' on Instance uuid e40120ae-eb4e-4f0b-9d8f-f0210de78c4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.729 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.732 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.732 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.779 227317 DEBUG nova.virt.libvirt.vif [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1296850176',display_name='tempest-LiveMigrationTest-server-1296850176',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1296850176',id=5,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:07:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-02y9chrd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:07:36Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=e40120ae-eb4e-4f0b-9d8f-f0210de78c4f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.780 227317 DEBUG nova.network.os_vif_util [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converting VIF {"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.781 227317 DEBUG nova.network.os_vif_util [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.782 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 13:07:55 np0005596062 nova_compute[227313]:  <mac address="fa:16:3e:35:48:ae"/>
Jan 26 13:07:55 np0005596062 nova_compute[227313]:  <model type="virtio"/>
Jan 26 13:07:55 np0005596062 nova_compute[227313]:  <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:07:55 np0005596062 nova_compute[227313]:  <mtu size="1442"/>
Jan 26 13:07:55 np0005596062 nova_compute[227313]:  <target dev="tap06538465-e3"/>
Jan 26 13:07:55 np0005596062 nova_compute[227313]: </interface>
Jan 26 13:07:55 np0005596062 nova_compute[227313]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.783 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 26 13:07:55 np0005596062 podman[232336]: 2026-01-26 18:07:55.86382332 +0000 UTC m=+0.062791839 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:07:55 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1903678147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.890 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.897 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:07:55 np0005596062 nova_compute[227313]: 2026-01-26 18:07:55.923 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.013 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.014 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.236 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.237 227317 INFO nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 26 13:07:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:07:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:07:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.404 227317 INFO nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 26 13:07:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.698 227317 DEBUG nova.compute.manager [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.699 227317 DEBUG oslo_concurrency.lockutils [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.699 227317 DEBUG oslo_concurrency.lockutils [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.699 227317 DEBUG oslo_concurrency.lockutils [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.699 227317 DEBUG nova.compute.manager [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-plugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.699 227317 WARNING nova.compute.manager [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received unexpected event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.700 227317 DEBUG nova.compute.manager [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-changed-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.700 227317 DEBUG nova.compute.manager [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Refreshing instance network info cache due to event network-changed-06538465-e309-4216-af1a-244565d3805b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.700 227317 DEBUG oslo_concurrency.lockutils [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.700 227317 DEBUG oslo_concurrency.lockutils [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.700 227317 DEBUG nova.network.neutron [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Refreshing network info cache for port 06538465-e309-4216-af1a-244565d3805b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:07:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:56.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.907 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:07:56 np0005596062 nova_compute[227313]: 2026-01-26 18:07:56.908 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.013 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.014 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.014 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.142 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:07:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:57.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:07:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:07:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:07:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.411 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.412 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.779 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450877.7788906, e40120ae-eb4e-4f0b-9d8f-f0210de78c4f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.780 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.839 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.845 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.916 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.917 227317 DEBUG nova.virt.libvirt.migration [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 26 13:07:57 np0005596062 nova_compute[227313]: 2026-01-26 18:07:57.956 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 26 13:07:58 np0005596062 kernel: tap06538465-e3 (unregistering): left promiscuous mode
Jan 26 13:07:58 np0005596062 NetworkManager[48993]: <info>  [1769450878.0225] device (tap06538465-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.036 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00045|binding|INFO|Releasing lport 06538465-e309-4216-af1a-244565d3805b from this chassis (sb_readonly=0)
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00046|binding|INFO|Setting lport 06538465-e309-4216-af1a-244565d3805b down in Southbound
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00047|binding|INFO|Releasing lport 8efebc34-f8eb-42e5-af94-78e84c0dcbba from this chassis (sb_readonly=0)
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00048|binding|INFO|Setting lport 8efebc34-f8eb-42e5-af94-78e84c0dcbba down in Southbound
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00049|binding|INFO|Removing iface tap06538465-e3 ovn-installed in OVS
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.068 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00050|binding|INFO|Releasing lport 46cfbba6-430a-495c-9d6a-60cf58c877d3 from this chassis (sb_readonly=0)
Jan 26 13:07:58 np0005596062 ovn_controller[133984]: 2026-01-26T18:07:58Z|00051|binding|INFO|Releasing lport ec5ab65e-333c-4443-bd37-b74fa484479e from this chassis (sb_readonly=0)
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.079 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:48:ae 10.100.0.14'], port_security=['fa:16:3e:35:48:ae 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c76f2593-4bbb-4cef-b447-9e180245ada6'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1321931442', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e40120ae-eb4e-4f0b-9d8f-f0210de78c4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1321931442', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=06538465-e309-4216-af1a-244565d3805b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.081 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:69:fa 19.80.0.72'], port_security=['fa:16:3e:c6:69:fa 19.80.0.72'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['06538465-e309-4216-af1a-244565d3805b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2075617635', 'neutron:cidrs': '19.80.0.72/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2075617635', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '3', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=75dd0954-cbf3-4a3e-a6ef-19fcd101cc5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8efebc34-f8eb-42e5-af94-78e84c0dcbba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.083 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 06538465-e309-4216-af1a-244565d3805b in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b unbound from our chassis#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.084 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0516cc55-93b8-4bf2-b595-d07702fa255b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.085 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ee73ae-3e11-4f82-9db7-8a729846140b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.085 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b namespace which is not needed anymore#033[00m
Jan 26 13:07:58 np0005596062 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 26 13:07:58 np0005596062 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 15.865s CPU time.
Jan 26 13:07:58 np0005596062 systemd-machined[195380]: Machine qemu-3-instance-00000005 terminated.
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.176 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [NOTICE]   (232013) : haproxy version is 2.8.14-c23fe91
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [NOTICE]   (232013) : path to executable is /usr/sbin/haproxy
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [WARNING]  (232013) : Exiting Master process...
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [ALERT]    (232013) : Current worker (232015) exited with code 143 (Terminated)
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[232009]: [WARNING]  (232013) : All workers exited. Exiting... (0)
Jan 26 13:07:58 np0005596062 systemd[1]: libpod-48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60.scope: Deactivated successfully.
Jan 26 13:07:58 np0005596062 podman[232385]: 2026-01-26 18:07:58.290381919 +0000 UTC m=+0.094981204 container died 48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 26 13:07:58 np0005596062 virtqemud[226715]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk: No such file or directory
Jan 26 13:07:58 np0005596062 virtqemud[226715]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_disk: No such file or directory
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.369 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.369 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.370 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 26 13:07:58 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60-userdata-shm.mount: Deactivated successfully.
Jan 26 13:07:58 np0005596062 systemd[1]: var-lib-containers-storage-overlay-41f07daad5ceef7449ac1fb8869b5ac8dbc03632619eacf02dd7c7f63b5cf736-merged.mount: Deactivated successfully.
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.419 227317 DEBUG nova.virt.libvirt.guest [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'e40120ae-eb4e-4f0b-9d8f-f0210de78c4f' (instance-00000005) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.420 227317 INFO nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migration operation has completed#033[00m
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.420 227317 INFO nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] _post_live_migration() is started..#033[00m
Jan 26 13:07:58 np0005596062 podman[232385]: 2026-01-26 18:07:58.533916467 +0000 UTC m=+0.338515752 container cleanup 48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 13:07:58 np0005596062 systemd[1]: libpod-conmon-48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60.scope: Deactivated successfully.
Jan 26 13:07:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:07:58.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:58 np0005596062 podman[232428]: 2026-01-26 18:07:58.750949381 +0000 UTC m=+0.190475830 container remove 48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.758 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1d934d99-35e4-431a-97e2-235f5513b531]: (4, ('Mon Jan 26 06:07:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b (48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60)\n48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60\nMon Jan 26 06:07:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b (48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60)\n48eb159a1f30bb79e925d0783aa4c25aa5c0fc2f03e1a9827670cc44a6f97c60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.760 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[be3eeaf3-45d4-4954-a008-9874161c1d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.761 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0516cc55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.763 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:58 np0005596062 kernel: tap0516cc55-90: left promiscuous mode
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.780 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:58 np0005596062 nova_compute[227313]: 2026-01-26 18:07:58.781 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.784 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[529a553b-96ee-487b-a880-1929957f72ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.799 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[243dff0e-a1a1-471e-a4d6-703a390479a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.802 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d96404-8473-44d5-8b57-46af03dd4a3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.822 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[648d8f91-dad8-4094-84c0-f48c4cfc8bee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461568, 'reachable_time': 20776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232447, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.825 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.825 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[429d2e43-0adf-4503-b803-044768e4e14d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 systemd[1]: run-netns-ovnmeta\x2d0516cc55\x2d93b8\x2d4bf2\x2db595\x2dd07702fa255b.mount: Deactivated successfully.
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.826 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 8efebc34-f8eb-42e5-af94-78e84c0dcbba in datapath ebb9e0b4-8385-462a-84cc-87c6f72c0c65 unbound from our chassis#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.829 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ebb9e0b4-8385-462a-84cc-87c6f72c0c65, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.830 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[51e3b0a2-3595-4130-8388-82b7c45f8f9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:58 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:58.831 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65 namespace which is not needed anymore#033[00m
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [NOTICE]   (232089) : haproxy version is 2.8.14-c23fe91
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [NOTICE]   (232089) : path to executable is /usr/sbin/haproxy
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [WARNING]  (232089) : Exiting Master process...
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [WARNING]  (232089) : Exiting Master process...
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [ALERT]    (232089) : Current worker (232091) exited with code 143 (Terminated)
Jan 26 13:07:58 np0005596062 neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65[232085]: [WARNING]  (232089) : All workers exited. Exiting... (0)
Jan 26 13:07:58 np0005596062 systemd[1]: libpod-4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882.scope: Deactivated successfully.
Jan 26 13:07:58 np0005596062 podman[232466]: 2026-01-26 18:07:58.996997367 +0000 UTC m=+0.070102023 container died 4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 13:07:59 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882-userdata-shm.mount: Deactivated successfully.
Jan 26 13:07:59 np0005596062 systemd[1]: var-lib-containers-storage-overlay-28ea82d1afebcdb69351f1bb7046eeda26c54ae63dd41906ceea6a3e11b346d9-merged.mount: Deactivated successfully.
Jan 26 13:07:59 np0005596062 podman[232466]: 2026-01-26 18:07:59.035852958 +0000 UTC m=+0.108957614 container cleanup 4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 13:07:59 np0005596062 systemd[1]: libpod-conmon-4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882.scope: Deactivated successfully.
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.128 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:07:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:07:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:07:59.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:07:59 np0005596062 podman[232497]: 2026-01-26 18:07:59.194957924 +0000 UTC m=+0.138300414 container remove 4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.201 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1da9794b-1e94-4221-98d8-0275fa484091]: (4, ('Mon Jan 26 06:07:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65 (4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882)\n4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882\nMon Jan 26 06:07:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65 (4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882)\n4b426182cac41cee715cdec681bc12ddbc4eff3035888e26e79f11c309ce2882\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.203 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f4db1b76-abd2-43db-8d7f-4b2e8552a9c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.204 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebb9e0b4-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.206 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:59 np0005596062 kernel: tapebb9e0b4-80: left promiscuous mode
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.229 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.232 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9e14082e-2adf-4df8-9d7b-e34522236103]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.250 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d4047d-11e8-42a8-a2f4-1764ec89b785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.252 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a5888891-8112-484d-9d47-aab527026b63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.271 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a661b14b-b6c8-4ed2-bca3-78849b647a40]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461674, 'reachable_time': 29628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232513, 'error': None, 'target': 'ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.274 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ebb9e0b4-8385-462a-84cc-87c6f72c0c65 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:07:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:07:59.274 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1dcb9a-a5fb-405c-a5b9-eaa5c2ed4de4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:07:59 np0005596062 systemd[1]: run-netns-ovnmeta\x2debb9e0b4\x2d8385\x2d462a\x2d84cc\x2d87c6f72c0c65.mount: Deactivated successfully.
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.547 227317 DEBUG nova.compute.manager [req-b44fa650-6b20-4777-a4d3-e45ec9650dd5 req-bc3dff8a-6271-48b5-9d66-0aef40a1931d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-unplugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.549 227317 DEBUG oslo_concurrency.lockutils [req-b44fa650-6b20-4777-a4d3-e45ec9650dd5 req-bc3dff8a-6271-48b5-9d66-0aef40a1931d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.549 227317 DEBUG oslo_concurrency.lockutils [req-b44fa650-6b20-4777-a4d3-e45ec9650dd5 req-bc3dff8a-6271-48b5-9d66-0aef40a1931d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.550 227317 DEBUG oslo_concurrency.lockutils [req-b44fa650-6b20-4777-a4d3-e45ec9650dd5 req-bc3dff8a-6271-48b5-9d66-0aef40a1931d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.551 227317 DEBUG nova.compute.manager [req-b44fa650-6b20-4777-a4d3-e45ec9650dd5 req-bc3dff8a-6271-48b5-9d66-0aef40a1931d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-unplugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.551 227317 DEBUG nova.compute.manager [req-b44fa650-6b20-4777-a4d3-e45ec9650dd5 req-bc3dff8a-6271-48b5-9d66-0aef40a1931d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-unplugged-06538465-e309-4216-af1a-244565d3805b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:07:59 np0005596062 nova_compute[227313]: 2026-01-26 18:07:59.612 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:00.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:01.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.172 227317 DEBUG nova.network.neutron [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Activated binding for port 06538465-e309-4216-af1a-244565d3805b and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.173 227317 DEBUG nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.173 227317 DEBUG nova.virt.libvirt.vif [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1296850176',display_name='tempest-LiveMigrationTest-server-1296850176',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1296850176',id=5,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:07:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-02y9chrd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:07:42Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=e40120ae-eb4e-4f0b-9d8f-f0210de78c4f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.174 227317 DEBUG nova.network.os_vif_util [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converting VIF {"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.175 227317 DEBUG nova.network.os_vif_util [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.175 227317 DEBUG os_vif [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.177 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.177 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06538465-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.178 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.180 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.182 227317 INFO os_vif [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:48:ae,bridge_name='br-int',has_traffic_filtering=True,id=06538465-e309-4216-af1a-244565d3805b,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap06538465-e3')#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.182 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.183 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.183 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.183 227317 DEBUG nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.184 227317 INFO nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Deleting instance files /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_del#033[00m
Jan 26 13:08:01 np0005596062 nova_compute[227313]: 2026-01-26 18:08:01.184 227317 INFO nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Deletion of /var/lib/nova/instances/e40120ae-eb4e-4f0b-9d8f-f0210de78c4f_del complete#033[00m
Jan 26 13:08:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.164 227317 DEBUG nova.compute.manager [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.164 227317 DEBUG oslo_concurrency.lockutils [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.165 227317 DEBUG oslo_concurrency.lockutils [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.165 227317 DEBUG oslo_concurrency.lockutils [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.165 227317 DEBUG nova.compute.manager [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-plugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.165 227317 WARNING nova.compute.manager [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received unexpected event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.166 227317 DEBUG nova.compute.manager [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.166 227317 DEBUG oslo_concurrency.lockutils [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.166 227317 DEBUG oslo_concurrency.lockutils [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.166 227317 DEBUG oslo_concurrency.lockutils [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.167 227317 DEBUG nova.compute.manager [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-plugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.167 227317 WARNING nova.compute.manager [req-696cdffd-6be0-45b2-b888-849ff6813a0e req-fefbc3ac-47f3-490a-9b8a-a7005716967f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received unexpected event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:08:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:02.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.873 227317 DEBUG nova.network.neutron [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updated VIF entry in instance network info cache for port 06538465-e309-4216-af1a-244565d3805b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.873 227317 DEBUG nova.network.neutron [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updating instance_info_cache with network_info: [{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.972 227317 DEBUG oslo_concurrency.lockutils [req-124ce85f-64ad-46f8-af2d-22c221c347fe req-26360cc9-0c11-4364-8907-673d514e6fa3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.972 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.972 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:08:02 np0005596062 nova_compute[227313]: 2026-01-26 18:08:02.973 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e40120ae-eb4e-4f0b-9d8f-f0210de78c4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:08:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:03.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.682262) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450883682339, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 365, "num_deletes": 256, "total_data_size": 334115, "memory_usage": 342712, "flush_reason": "Manual Compaction"}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450883685289, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 220890, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22995, "largest_seqno": 23355, "table_properties": {"data_size": 218598, "index_size": 392, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5418, "raw_average_key_size": 17, "raw_value_size": 214005, "raw_average_value_size": 690, "num_data_blocks": 18, "num_entries": 310, "num_filter_entries": 310, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450875, "oldest_key_time": 1769450875, "file_creation_time": 1769450883, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 3053 microseconds, and 1277 cpu microseconds.
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.685317) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 220890 bytes OK
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.685329) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.686463) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.686473) EVENT_LOG_v1 {"time_micros": 1769450883686470, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.686485) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 331592, prev total WAL file size 331592, number of live WAL files 2.
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.686915) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(215KB)], [45(7674KB)]
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450883686997, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 8079410, "oldest_snapshot_seqno": -1}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4548 keys, 7944672 bytes, temperature: kUnknown
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450883744516, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 7944672, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7913963, "index_size": 18247, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 116202, "raw_average_key_size": 25, "raw_value_size": 7831068, "raw_average_value_size": 1721, "num_data_blocks": 742, "num_entries": 4548, "num_filter_entries": 4548, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769450883, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.744899) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 7944672 bytes
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.746303) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.9 rd, 137.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 7.5 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(72.5) write-amplify(36.0) OK, records in: 5071, records dropped: 523 output_compression: NoCompression
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.746319) EVENT_LOG_v1 {"time_micros": 1769450883746312, "job": 26, "event": "compaction_finished", "compaction_time_micros": 57737, "compaction_time_cpu_micros": 19215, "output_level": 6, "num_output_files": 1, "total_output_size": 7944672, "num_input_records": 5071, "num_output_records": 4548, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450883746448, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769450883747597, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.686829) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.747625) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.747629) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.747631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.747632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:08:03 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:08:03.747633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.472 227317 DEBUG nova.compute.manager [req-25e3f615-d759-453e-a02f-a3fe5e10a73d req-911c2429-4363-4be0-b2b0-6380031a5463 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.472 227317 DEBUG oslo_concurrency.lockutils [req-25e3f615-d759-453e-a02f-a3fe5e10a73d req-911c2429-4363-4be0-b2b0-6380031a5463 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.472 227317 DEBUG oslo_concurrency.lockutils [req-25e3f615-d759-453e-a02f-a3fe5e10a73d req-911c2429-4363-4be0-b2b0-6380031a5463 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.472 227317 DEBUG oslo_concurrency.lockutils [req-25e3f615-d759-453e-a02f-a3fe5e10a73d req-911c2429-4363-4be0-b2b0-6380031a5463 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.473 227317 DEBUG nova.compute.manager [req-25e3f615-d759-453e-a02f-a3fe5e10a73d req-911c2429-4363-4be0-b2b0-6380031a5463 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] No waiting events found dispatching network-vif-plugged-06538465-e309-4216-af1a-244565d3805b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.473 227317 WARNING nova.compute.manager [req-25e3f615-d759-453e-a02f-a3fe5e10a73d req-911c2429-4363-4be0-b2b0-6380031a5463 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Received unexpected event network-vif-plugged-06538465-e309-4216-af1a-244565d3805b for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:08:04 np0005596062 nova_compute[227313]: 2026-01-26 18:08:04.614 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:04.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:04 np0005596062 podman[232517]: 2026-01-26 18:08:04.873318109 +0000 UTC m=+0.078193008 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 13:08:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:05.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:08:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:08:06 np0005596062 nova_compute[227313]: 2026-01-26 18:08:06.179 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:06.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:07.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:08 np0005596062 nova_compute[227313]: 2026-01-26 18:08:08.325 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updating instance_info_cache with network_info: [{"id": "06538465-e309-4216-af1a-244565d3805b", "address": "fa:16:3e:35:48:ae", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06538465-e3", "ovs_interfaceid": "06538465-e309-4216-af1a-244565d3805b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:08:08 np0005596062 nova_compute[227313]: 2026-01-26 18:08:08.341 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-e40120ae-eb4e-4f0b-9d8f-f0210de78c4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:08:08 np0005596062 nova_compute[227313]: 2026-01-26 18:08:08.341 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:08:08 np0005596062 nova_compute[227313]: 2026-01-26 18:08:08.341 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:08.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:09.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:09.157 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:09.159 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:09.159 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:09 np0005596062 nova_compute[227313]: 2026-01-26 18:08:09.615 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:10.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:11.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:11 np0005596062 nova_compute[227313]: 2026-01-26 18:08:11.237 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.250 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.251 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.251 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "e40120ae-eb4e-4f0b-9d8f-f0210de78c4f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.324 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.325 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.325 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.325 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.325 227317 DEBUG oslo_concurrency.processutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:12.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:08:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2532738341' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:08:12 np0005596062 nova_compute[227313]: 2026-01-26 18:08:12.830 227317 DEBUG oslo_concurrency.processutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.050 227317 WARNING nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.052 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4903MB free_disk=20.897377014160156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.052 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.052 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:13.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.170 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Migration for instance e40120ae-eb4e-4f0b-9d8f-f0210de78c4f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.202 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.272 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Migration 1b877e7a-f025-4e3a-b89d-0d8bb1ffb592 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.272 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.273 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.340 227317 DEBUG oslo_concurrency.processutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.366 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769450878.3646564, e40120ae-eb4e-4f0b-9d8f-f0210de78c4f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.367 227317 INFO nova.compute.manager [-] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.397 227317 DEBUG nova.compute.manager [None req-ee02f06a-efbd-4185-bde3-692868ef0904 - - - - - -] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:08:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:08:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/94878428' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.810 227317 DEBUG oslo_concurrency.processutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.816 227317 DEBUG nova.compute.provider_tree [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.829 227317 DEBUG nova.scheduler.client.report [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.849 227317 DEBUG nova.compute.resource_tracker [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.850 227317 DEBUG oslo_concurrency.lockutils [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:13 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.856 227317 INFO nova.compute.manager [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 26 13:08:14 np0005596062 nova_compute[227313]: 2026-01-26 18:08:13.999 227317 INFO nova.scheduler.client.report [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Deleted allocation for migration 1b877e7a-f025-4e3a-b89d-0d8bb1ffb592#033[00m
Jan 26 13:08:14 np0005596062 nova_compute[227313]: 2026-01-26 18:08:14.000 227317 DEBUG nova.virt.libvirt.driver [None req-335d76ca-73c4-4e0c-8853-855fc0bca693 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: e40120ae-eb4e-4f0b-9d8f-f0210de78c4f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 26 13:08:14 np0005596062 nova_compute[227313]: 2026-01-26 18:08:14.617 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:14.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:15.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:16 np0005596062 nova_compute[227313]: 2026-01-26 18:08:16.239 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:16.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:17.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:18.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:19.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:19 np0005596062 nova_compute[227313]: 2026-01-26 18:08:19.620 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:20.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:21.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:21 np0005596062 nova_compute[227313]: 2026-01-26 18:08:21.243 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:22.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:23.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:24 np0005596062 nova_compute[227313]: 2026-01-26 18:08:24.624 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:24.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:25.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:26 np0005596062 nova_compute[227313]: 2026-01-26 18:08:26.245 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:26 np0005596062 podman[232724]: 2026-01-26 18:08:26.760652692 +0000 UTC m=+0.077649144 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 26 13:08:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:26.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:27.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:28.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:29 np0005596062 nova_compute[227313]: 2026-01-26 18:08:29.628 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:30.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:31.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:31 np0005596062 nova_compute[227313]: 2026-01-26 18:08:31.248 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:32.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:34 np0005596062 nova_compute[227313]: 2026-01-26 18:08:34.630 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:34.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:35 np0005596062 podman[232773]: 2026-01-26 18:08:35.914573958 +0000 UTC m=+0.121638482 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 26 13:08:36 np0005596062 nova_compute[227313]: 2026-01-26 18:08:36.250 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:36.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:37.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:38.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.066 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.066 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.105 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:08:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:39.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.267 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.268 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.284 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.285 227317 INFO nova.compute.claims [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.494 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.633 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:08:39 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2919102091' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.981 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:39 np0005596062 nova_compute[227313]: 2026-01-26 18:08:39.987 227317 DEBUG nova.compute.provider_tree [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.015 227317 DEBUG nova.scheduler.client.report [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.088 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.089 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.223 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.223 227317 DEBUG nova.network.neutron [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.259 227317 INFO nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.297 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.371 227317 INFO nova.virt.block_device [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Booting with volume b5b60a57-95c9-48f2-a72a-66b14f738be8 at /dev/vda#033[00m
Jan 26 13:08:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:08:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2247666425' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:08:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:08:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2247666425' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:08:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:40.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.992 227317 DEBUG os_brick.utils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 26 13:08:40 np0005596062 nova_compute[227313]: 2026-01-26 18:08:40.994 227317 INFO oslo.privsep.daemon [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp4fhg3b4b/privsep.sock']#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.027 227317 DEBUG nova.policy [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e3f505042e7463683259f02e8e59eca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1f2cad350784d7eae39fc23fb032500', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:08:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:41.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.252 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.718 227317 INFO oslo.privsep.daemon [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.610 232828 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.613 232828 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.615 232828 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.615 232828 INFO oslo.privsep.daemon [-] privsep daemon running as pid 232828#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.722 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[7b24808d-4f87-4de7-97d2-64a2db691101]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.813 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.829 232828 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.830 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[c76f6cbe-b1bb-472f-aaad-2a8405285755]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.832 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.847 232828 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.847 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e03eb3-4c38-4c7c-a7a8-0cabf7a422d1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:c828cff26df4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.849 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.864 232828 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.865 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[698f7d1b-3129-465d-9b50-91826ecb63bb]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.868 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[32e7dd85-d55a-466b-911d-7d667a95b423]: (4, '5c33c4b0-14ac-46af-8c94-d3bb1b6300af') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.869 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.903 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "nvme version" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.908 227317 DEBUG os_brick.initiator.connectors.lightos [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.909 227317 DEBUG os_brick.initiator.connectors.lightos [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.910 227317 DEBUG os_brick.initiator.connectors.lightos [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.910 227317 DEBUG os_brick.utils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] <== get_connector_properties: return (917ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:c828cff26df4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5c33c4b0-14ac-46af-8c94-d3bb1b6300af', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 26 13:08:41 np0005596062 nova_compute[227313]: 2026-01-26 18:08:41.911 227317 DEBUG nova.virt.block_device [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating existing volume attachment record: 5025e74b-c2b1-4272-a524-e7eeb678c73d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 26 13:08:42 np0005596062 nova_compute[227313]: 2026-01-26 18:08:42.673 227317 DEBUG nova.network.neutron [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Successfully created port: c3bd4b07-ea7b-40da-8a33-0ac219177512 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:08:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:43.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:44 np0005596062 nova_compute[227313]: 2026-01-26 18:08:44.633 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:44.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:45.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.905 227317 DEBUG nova.network.neutron [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Successfully updated port: c3bd4b07-ea7b-40da-8a33-0ac219177512 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.921 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.923 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.923 227317 INFO nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Creating image(s)#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.924 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.924 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Ensure instance console log exists: /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.925 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.925 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.925 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.929 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.929 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquired lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:08:45 np0005596062 nova_compute[227313]: 2026-01-26 18:08:45.929 227317 DEBUG nova.network.neutron [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:08:46 np0005596062 nova_compute[227313]: 2026-01-26 18:08:46.254 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:46 np0005596062 nova_compute[227313]: 2026-01-26 18:08:46.738 227317 DEBUG nova.compute.manager [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-changed-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:08:46 np0005596062 nova_compute[227313]: 2026-01-26 18:08:46.739 227317 DEBUG nova.compute.manager [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Refreshing instance network info cache due to event network-changed-c3bd4b07-ea7b-40da-8a33-0ac219177512. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:08:46 np0005596062 nova_compute[227313]: 2026-01-26 18:08:46.739 227317 DEBUG oslo_concurrency.lockutils [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:08:47 np0005596062 nova_compute[227313]: 2026-01-26 18:08:47.140 227317 DEBUG nova.network.neutron [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:08:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:47.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:08:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:08:47 np0005596062 ovn_controller[133984]: 2026-01-26T18:08:47Z|00052|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 13:08:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:49.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:49 np0005596062 nova_compute[227313]: 2026-01-26 18:08:49.636 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.284 227317 DEBUG nova.network.neutron [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating instance_info_cache with network_info: [{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.312 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Releasing lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.313 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Instance network_info: |[{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.314 227317 DEBUG oslo_concurrency.lockutils [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.315 227317 DEBUG nova.network.neutron [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Refreshing network info cache for port c3bd4b07-ea7b-40da-8a33-0ac219177512 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.320 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Start _get_guest_xml network_info=[{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/vda', 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'attachment_id': '5025e74b-c2b1-4272-a524-e7eeb678c73d', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b5b60a57-95c9-48f2-a72a-66b14f738be8', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b5b60a57-95c9-48f2-a72a-66b14f738be8', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'attached_at': '', 'detached_at': '', 'volume_id': 'b5b60a57-95c9-48f2-a72a-66b14f738be8', 'serial': 'b5b60a57-95c9-48f2-a72a-66b14f738be8'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.327 227317 WARNING nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.335 227317 DEBUG nova.virt.libvirt.host [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.335 227317 DEBUG nova.virt.libvirt.host [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.345 227317 DEBUG nova.virt.libvirt.host [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.345 227317 DEBUG nova.virt.libvirt.host [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.347 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.348 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.348 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.348 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.349 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.349 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.349 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.349 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.350 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.350 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.350 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.351 227317 DEBUG nova.virt.hardware [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.385 227317 DEBUG nova.storage.rbd_utils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image 4c4b2733-13a7-49fe-bbfb-f3e063298716_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.390 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:08:50 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1269408203' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.838 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.839 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.839 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.840 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.878 227317 DEBUG nova.virt.libvirt.vif [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-634605113',display_name='tempest-LiveMigrationTest-server-634605113',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-634605113',id=7,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-8pp60248',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:08:40Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.879 227317 DEBUG nova.network.os_vif_util [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converting VIF {"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.881 227317 DEBUG nova.network.os_vif_util [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.884 227317 DEBUG nova.objects.instance [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4c4b2733-13a7-49fe-bbfb-f3e063298716 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.914 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <uuid>4c4b2733-13a7-49fe-bbfb-f3e063298716</uuid>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <name>instance-00000007</name>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <nova:name>tempest-LiveMigrationTest-server-634605113</nova:name>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:08:50</nova:creationTime>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:user uuid="9e3f505042e7463683259f02e8e59eca">tempest-LiveMigrationTest-877386369-project-member</nova:user>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:project uuid="b1f2cad350784d7eae39fc23fb032500">tempest-LiveMigrationTest-877386369</nova:project>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <nova:port uuid="c3bd4b07-ea7b-40da-8a33-0ac219177512">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <entry name="serial">4c4b2733-13a7-49fe-bbfb-f3e063298716</entry>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <entry name="uuid">4c4b2733-13a7-49fe-bbfb-f3e063298716</entry>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/4c4b2733-13a7-49fe-bbfb-f3e063298716_disk.config">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="volumes/volume-b5b60a57-95c9-48f2-a72a-66b14f738be8">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <serial>b5b60a57-95c9-48f2-a72a-66b14f738be8</serial>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:89:24:36"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <target dev="tapc3bd4b07-ea"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/console.log" append="off"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:08:50 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:08:50 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:08:50 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:08:50 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.916 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Preparing to wait for external event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.916 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.916 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.916 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.917 227317 DEBUG nova.virt.libvirt.vif [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-634605113',display_name='tempest-LiveMigrationTest-server-634605113',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-634605113',id=7,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-8pp60248',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_
at=2026-01-26T18:08:40Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.917 227317 DEBUG nova.network.os_vif_util [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converting VIF {"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.918 227317 DEBUG nova.network.os_vif_util [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.918 227317 DEBUG os_vif [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.919 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.919 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.919 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.923 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.923 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3bd4b07-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.924 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3bd4b07-ea, col_values=(('external_ids', {'iface-id': 'c3bd4b07-ea7b-40da-8a33-0ac219177512', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:24:36', 'vm-uuid': '4c4b2733-13a7-49fe-bbfb-f3e063298716'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.926 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:50 np0005596062 NetworkManager[48993]: <info>  [1769450930.9279] manager: (tapc3bd4b07-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.929 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.934 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:50 np0005596062 nova_compute[227313]: 2026-01-26 18:08:50.935 227317 INFO os_vif [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea')#033[00m
Jan 26 13:08:51 np0005596062 nova_compute[227313]: 2026-01-26 18:08:51.015 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:08:51 np0005596062 nova_compute[227313]: 2026-01-26 18:08:51.015 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:08:51 np0005596062 nova_compute[227313]: 2026-01-26 18:08:51.016 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] No VIF found with MAC fa:16:3e:89:24:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:08:51 np0005596062 nova_compute[227313]: 2026-01-26 18:08:51.016 227317 INFO nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Using config drive#033[00m
Jan 26 13:08:51 np0005596062 nova_compute[227313]: 2026-01-26 18:08:51.046 227317 DEBUG nova.storage.rbd_utils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image 4c4b2733-13a7-49fe-bbfb-f3e063298716_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:08:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:08:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:51.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:08:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:52 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:52.365 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:08:52 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:52.366 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.366 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.375 227317 INFO nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Creating config drive at /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/disk.config#033[00m
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.381 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr9ppufqp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.521 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr9ppufqp" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.683 227317 DEBUG nova.storage.rbd_utils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] rbd image 4c4b2733-13a7-49fe-bbfb-f3e063298716_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:08:52 np0005596062 nova_compute[227313]: 2026-01-26 18:08:52.687 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/disk.config 4c4b2733-13a7-49fe-bbfb-f3e063298716_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.047 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.190 227317 DEBUG oslo_concurrency.processutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/disk.config 4c4b2733-13a7-49fe-bbfb-f3e063298716_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.191 227317 INFO nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deleting local config drive /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/disk.config because it was imported into RBD.#033[00m
Jan 26 13:08:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:53.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:53 np0005596062 kernel: tapc3bd4b07-ea: entered promiscuous mode
Jan 26 13:08:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:08:53Z|00053|binding|INFO|Claiming lport c3bd4b07-ea7b-40da-8a33-0ac219177512 for this chassis.
Jan 26 13:08:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:08:53Z|00054|binding|INFO|c3bd4b07-ea7b-40da-8a33-0ac219177512: Claiming fa:16:3e:89:24:36 10.100.0.12
Jan 26 13:08:53 np0005596062 NetworkManager[48993]: <info>  [1769450933.2433] manager: (tapc3bd4b07-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.244 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:53.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.248 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.253 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:24:36 10.100.0.12'], port_security=['fa:16:3e:89:24:36 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=c3bd4b07-ea7b-40da-8a33-0ac219177512) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.254 143929 INFO neutron.agent.ovn.metadata.agent [-] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b bound to our chassis#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.256 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0516cc55-93b8-4bf2-b595-d07702fa255b#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.271 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7fcdeb28-5a77-47e8-ad99-4bd2db617ae7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.272 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0516cc55-91 in ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.274 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0516cc55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.275 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d83fa1b4-98f5-4c25-89d0-0d1865be09b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 systemd-machined[195380]: New machine qemu-4-instance-00000007.
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.276 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c3aafd30-87c0-4736-9b08-745c5f27a7d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 systemd-udevd[233008]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.289 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[f94fb2b7-6db5-4063-8920-6e5488f776fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 NetworkManager[48993]: <info>  [1769450933.2991] device (tapc3bd4b07-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:08:53 np0005596062 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Jan 26 13:08:53 np0005596062 NetworkManager[48993]: <info>  [1769450933.3003] device (tapc3bd4b07-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.306 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:08:53Z|00055|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 ovn-installed in OVS
Jan 26 13:08:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:08:53Z|00056|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 up in Southbound
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.316 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.316 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f772a7-c80d-4077-af0a-44a4876a5068]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.349 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[c3fab5bf-a239-4576-9429-d24a7e4f185f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.354 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[313af6fc-1c8d-4525-8c31-5de5b485e8a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 NetworkManager[48993]: <info>  [1769450933.3560] manager: (tap0516cc55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.367 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.389 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[e0001386-8c28-4304-b713-001d699f7d85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.392 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[374a432b-cd51-4e84-af4e-d5afc66e26c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 NetworkManager[48993]: <info>  [1769450933.4156] device (tap0516cc55-90): carrier: link connected
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.422 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1005ec-f2b5-4831-972e-73e7d5dd7324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.438 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1b6ff4-67b1-4822-b619-335141ec8db5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0516cc55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:40:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469346, 'reachable_time': 20116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233040, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.453 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[163512c4-5f71-4b38-9e20-f0e9b0462cd8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:40ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469346, 'tstamp': 469346}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233041, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.468 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf143d1-c0fd-42e4-a87d-115d945eb047]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0516cc55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:40:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469346, 'reachable_time': 20116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233042, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.503 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[77d74b1e-0d9b-4682-b231-242cfe75e82d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.559 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cf297b4b-9593-499c-a544-b19a565117ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.561 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0516cc55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.561 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.561 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0516cc55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.563 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 NetworkManager[48993]: <info>  [1769450933.5638] manager: (tap0516cc55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 26 13:08:53 np0005596062 kernel: tap0516cc55-90: entered promiscuous mode
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.567 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0516cc55-90, col_values=(('external_ids', {'iface-id': '46cfbba6-430a-495c-9d6a-60cf58c877d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.568 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:08:53Z|00057|binding|INFO|Releasing lport 46cfbba6-430a-495c-9d6a-60cf58c877d3 from this chassis (sb_readonly=0)
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.581 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.582 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.583 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[75aab9c0-5bc1-4866-aef5-d0e044e6cdcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.583 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-0516cc55-93b8-4bf2-b595-d07702fa255b
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 0516cc55-93b8-4bf2-b595-d07702fa255b
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:08:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:08:53.584 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'env', 'PROCESS_TAG=haproxy-0516cc55-93b8-4bf2-b595-d07702fa255b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0516cc55-93b8-4bf2-b595-d07702fa255b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.920 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450933.9198532, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.921 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Started (Lifecycle Event)#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.940 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.944 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450933.9200575, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.944 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.974 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:08:53 np0005596062 nova_compute[227313]: 2026-01-26 18:08:53.977 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.006 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:08:54 np0005596062 podman[233115]: 2026-01-26 18:08:53.925637718 +0000 UTC m=+0.021356229 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:08:54 np0005596062 podman[233115]: 2026-01-26 18:08:54.103399529 +0000 UTC m=+0.199118050 container create debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 13:08:54 np0005596062 systemd[1]: Started libpod-conmon-debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33.scope.
Jan 26 13:08:54 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:08:54 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a58b7761373920f018a1da3ba20f4fb4a96a4aa42ec2f9713e82ef65831260/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:08:54 np0005596062 podman[233115]: 2026-01-26 18:08:54.242271557 +0000 UTC m=+0.337990088 container init debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 13:08:54 np0005596062 podman[233115]: 2026-01-26 18:08:54.247817704 +0000 UTC m=+0.343536245 container start debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 13:08:54 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [NOTICE]   (233136) : New worker (233138) forked
Jan 26 13:08:54 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [NOTICE]   (233136) : Loading success.
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.525 227317 DEBUG nova.network.neutron [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updated VIF entry in instance network info cache for port c3bd4b07-ea7b-40da-8a33-0ac219177512. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.525 227317 DEBUG nova.network.neutron [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating instance_info_cache with network_info: [{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.541 227317 DEBUG oslo_concurrency.lockutils [req-d1c76a93-2895-418b-b16b-baf30fd6d29a req-e16a4ce5-e20e-4a27-8308-d21a607b642f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:08:54 np0005596062 nova_compute[227313]: 2026-01-26 18:08:54.638 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.068 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.069 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.069 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:55.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:55.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:55 np0005596062 nova_compute[227313]: 2026-01-26 18:08:55.927 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.080 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.080 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.080 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.081 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.081 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:08:56 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4076085959' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.561 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.630 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.631 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.785 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.786 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4798MB free_disk=20.942779541015625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.786 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.786 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.861 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 4c4b2733-13a7-49fe-bbfb-f3e063298716 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.861 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.861 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:08:56 np0005596062 nova_compute[227313]: 2026-01-26 18:08:56.923 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:08:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:08:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:57.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:08:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:57.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:08:57 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1150032722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:08:57 np0005596062 nova_compute[227313]: 2026-01-26 18:08:57.357 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:08:57 np0005596062 nova_compute[227313]: 2026-01-26 18:08:57.362 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:08:57 np0005596062 nova_compute[227313]: 2026-01-26 18:08:57.430 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:08:57 np0005596062 nova_compute[227313]: 2026-01-26 18:08:57.451 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:08:57 np0005596062 nova_compute[227313]: 2026-01-26 18:08:57.451 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:57 np0005596062 podman[233193]: 2026-01-26 18:08:57.880471236 +0000 UTC m=+0.087330470 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.004 227317 DEBUG nova.compute.manager [req-67e63d31-d290-4475-99c6-c77e1428fefb req-721a0f0b-2ea7-4e9c-9a17-b6cfc69a702c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.004 227317 DEBUG oslo_concurrency.lockutils [req-67e63d31-d290-4475-99c6-c77e1428fefb req-721a0f0b-2ea7-4e9c-9a17-b6cfc69a702c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.004 227317 DEBUG oslo_concurrency.lockutils [req-67e63d31-d290-4475-99c6-c77e1428fefb req-721a0f0b-2ea7-4e9c-9a17-b6cfc69a702c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.004 227317 DEBUG oslo_concurrency.lockutils [req-67e63d31-d290-4475-99c6-c77e1428fefb req-721a0f0b-2ea7-4e9c-9a17-b6cfc69a702c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.004 227317 DEBUG nova.compute.manager [req-67e63d31-d290-4475-99c6-c77e1428fefb req-721a0f0b-2ea7-4e9c-9a17-b6cfc69a702c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Processing event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.005 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.010 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450938.0106647, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.011 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.012 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.015 227317 INFO nova.virt.libvirt.driver [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Instance spawned successfully.#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.015 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.042 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.042 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.043 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.043 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.044 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.044 227317 DEBUG nova.virt.libvirt.driver [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.058 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.061 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.093 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.125 227317 INFO nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Took 12.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.126 227317 DEBUG nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.200 227317 INFO nova.compute.manager [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Took 19.05 seconds to build instance.#033[00m
Jan 26 13:08:58 np0005596062 nova_compute[227313]: 2026-01-26 18:08:58.226 227317 DEBUG oslo_concurrency.lockutils [None req-ea9bc0d4-d304-4b32-9795-5f66e4aa6e75 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:08:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:08:59.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:08:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:08:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:08:59.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:08:59 np0005596062 nova_compute[227313]: 2026-01-26 18:08:59.640 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.162 227317 DEBUG nova.compute.manager [req-ea441c3f-09f3-4c12-8d85-ed2e4aaa34cf req-ba554b1b-7bd1-4949-9118-2703d14f9b1a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.162 227317 DEBUG oslo_concurrency.lockutils [req-ea441c3f-09f3-4c12-8d85-ed2e4aaa34cf req-ba554b1b-7bd1-4949-9118-2703d14f9b1a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.163 227317 DEBUG oslo_concurrency.lockutils [req-ea441c3f-09f3-4c12-8d85-ed2e4aaa34cf req-ba554b1b-7bd1-4949-9118-2703d14f9b1a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.163 227317 DEBUG oslo_concurrency.lockutils [req-ea441c3f-09f3-4c12-8d85-ed2e4aaa34cf req-ba554b1b-7bd1-4949-9118-2703d14f9b1a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.163 227317 DEBUG nova.compute.manager [req-ea441c3f-09f3-4c12-8d85-ed2e4aaa34cf req-ba554b1b-7bd1-4949-9118-2703d14f9b1a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.164 227317 WARNING nova.compute.manager [req-ea441c3f-09f3-4c12-8d85-ed2e4aaa34cf req-ba554b1b-7bd1-4949-9118-2703d14f9b1a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:09:00 np0005596062 nova_compute[227313]: 2026-01-26 18:09:00.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:01 np0005596062 nova_compute[227313]: 2026-01-26 18:09:01.022 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Check if temp file /var/lib/nova/instances/tmp3ouej46o exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 26 13:09:01 np0005596062 nova_compute[227313]: 2026-01-26 18:09:01.022 227317 DEBUG nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3ouej46o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4c4b2733-13a7-49fe-bbfb-f3e063298716',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 26 13:09:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:01.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:01.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:03.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:03.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
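The `beast:` access lines repeat every two seconds from 192.168.122.100 and .102 probing `HEAD /`, i.e. load-balancer health checks against radosgw. A small parser for this line shape, with the field layout inferred from the samples above rather than from a formal beast log specification:

```python
import re

# Regex inferred from the beast access-log samples in this capture; the field
# order (client, user, timestamp, request, status, bytes, trailing latency)
# is an assumption based on these lines, not an official format definition.
BEAST_RE = re.compile(
    r'beast: \S+: (?P<client>\S+) - (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) .* latency=(?P<latency>[\d.]+)s'
)

def parse_beast(line):
    """Return the parsed fields of one beast access line, or None."""
    m = BEAST_RE.search(line)
    if m is None:
        return None
    d = m.groupdict()
    d["status"] = int(d["status"])
    d["latency"] = float(d["latency"])
    return d

sample = ('beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous '
          '[26/Jan/2026:18:09:01.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
          'latency=0.001000026s')
```

Feeding the sample through `parse_beast` yields the client address, a 200 status, and the sub-millisecond latency, which is enough to separate health-check noise from real S3 traffic when filtering this log.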
Jan 26 13:09:04 np0005596062 nova_compute[227313]: 2026-01-26 18:09:04.644 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:05.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:05.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:05 np0005596062 nova_compute[227313]: 2026-01-26 18:09:05.932 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:06 np0005596062 podman[233316]: 2026-01-26 18:09:06.175554971 +0000 UTC m=+0.166086522 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 13:09:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:09:06 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/787657783' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:09:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:07.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:07.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:09:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:09:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:09:08 np0005596062 nova_compute[227313]: 2026-01-26 18:09:08.426 227317 DEBUG nova.compute.manager [req-43a89fa5-cdcb-4acc-95f8-c322753159fd req-ee570480-b350-4ace-8083-b65530cb512a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:08 np0005596062 nova_compute[227313]: 2026-01-26 18:09:08.428 227317 DEBUG oslo_concurrency.lockutils [req-43a89fa5-cdcb-4acc-95f8-c322753159fd req-ee570480-b350-4ace-8083-b65530cb512a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:08 np0005596062 nova_compute[227313]: 2026-01-26 18:09:08.428 227317 DEBUG oslo_concurrency.lockutils [req-43a89fa5-cdcb-4acc-95f8-c322753159fd req-ee570480-b350-4ace-8083-b65530cb512a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:08 np0005596062 nova_compute[227313]: 2026-01-26 18:09:08.428 227317 DEBUG oslo_concurrency.lockutils [req-43a89fa5-cdcb-4acc-95f8-c322753159fd req-ee570480-b350-4ace-8083-b65530cb512a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:08 np0005596062 nova_compute[227313]: 2026-01-26 18:09:08.428 227317 DEBUG nova.compute.manager [req-43a89fa5-cdcb-4acc-95f8-c322753159fd req-ee570480-b350-4ace-8083-b65530cb512a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:08 np0005596062 nova_compute[227313]: 2026-01-26 18:09:08.429 227317 DEBUG nova.compute.manager [req-43a89fa5-cdcb-4acc-95f8-c322753159fd req-ee570480-b350-4ace-8083-b65530cb512a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:09:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:09.158 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:09.159 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:09.159 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.193 227317 INFO nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Took 7.22 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.194 227317 DEBUG nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.212 227317 DEBUG nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3ouej46o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4c4b2733-13a7-49fe-bbfb-f3e063298716',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(10acc998-e5ff-412e-99fb-f31cf0378f0f),old_vol_attachment_ids={b5b60a57-95c9-48f2-a72a-66b14f738be8='5025e74b-c2b1-4272-a524-e7eeb678c73d'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.216 227317 DEBUG nova.objects.instance [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lazy-loading 'migration_context' on Instance uuid 4c4b2733-13a7-49fe-bbfb-f3e063298716 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.217 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.218 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.219 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 26 13:09:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:09.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.237 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Find same serial number: pos=1, serial=b5b60a57-95c9-48f2-a72a-66b14f738be8 _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.238 227317 DEBUG nova.virt.libvirt.vif [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-634605113',display_name='tempest-LiveMigrationTest-server-634605113',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-634605113',id=7,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:08:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-8pp60248',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:08:58Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.238 227317 DEBUG nova.network.os_vif_util [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converting VIF {"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.239 227317 DEBUG nova.network.os_vif_util [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.239 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 13:09:09 np0005596062 nova_compute[227313]:  <mac address="fa:16:3e:89:24:36"/>
Jan 26 13:09:09 np0005596062 nova_compute[227313]:  <model type="virtio"/>
Jan 26 13:09:09 np0005596062 nova_compute[227313]:  <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:09:09 np0005596062 nova_compute[227313]:  <mtu size="1442"/>
Jan 26 13:09:09 np0005596062 nova_compute[227313]:  <target dev="tapc3bd4b07-ea"/>
Jan 26 13:09:09 np0005596062 nova_compute[227313]: </interface>
Jan 26 13:09:09 np0005596062 nova_compute[227313]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
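The `<interface type="ethernet">` fragment logged above is the result of `_update_vif_xml` rewriting the source domain's interface definition before the migrate API call. A rough stand-alone illustration with `xml.etree.ElementTree`; the input XML and helper below are invented for the example (Nova's real code edits the full domain XML and handles more attachment types):

```python
import xml.etree.ElementTree as ET

# Invented source-side interface definition (OVS bridge attachment).
src = ET.fromstring(
    '<interface type="bridge">'
    '<mac address="fa:16:3e:89:24:36"/>'
    '<source bridge="br-int"/>'
    '<model type="virtio"/>'
    '</interface>'
)

def update_vif_xml(iface, target_dev, mtu):
    """Mimic the rewrite seen in the log: switch the attachment to
    type=ethernet, drop <source>, and pin the target tap device and MTU."""
    iface.set("type", "ethernet")
    source = iface.find("source")
    if source is not None:
        iface.remove(source)
    ET.SubElement(iface, "target", dev=target_dev)
    ET.SubElement(iface, "mtu", size=str(mtu))
    return iface

updated = update_vif_xml(src, "tapc3bd4b07-ea", 1442)
```

Serializing `updated` reproduces the essentials of the logged fragment: an ethernet-type interface carrying the MAC, MTU 1442, and `target dev="tapc3bd4b07-ea"`.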
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.239 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 26 13:09:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:09.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.647 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.722 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:09:09 np0005596062 nova_compute[227313]: 2026-01-26 18:09:09.722 227317 INFO nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
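The step table logged above, `[(0, 50), (150, 95), ..., (1500, 500)]`, is an arithmetic schedule: the permitted downtime starts at a base of 50 ms and is raised in equal 45 ms increments every 150 s of elapsed migration time until it reaches the 500 ms cap. A reconstruction consistent with those numbers (parameter names and the derivation are inferred from this log; the corresponding Nova config options are `live_migration_downtime`, `live_migration_downtime_steps`, and `live_migration_downtime_delay`):

```python
def downtime_steps(max_downtime_ms=500, steps=10, delay_s=150):
    """Yield (elapsed_seconds, downtime_ms) pairs matching the logged table."""
    base = max_downtime_ms // steps             # 500 // 10 = 50 ms starting cap
    offset = (max_downtime_ms - base) // steps  # 45 ms raise per step
    for i in range(steps + 1):
        yield i * delay_s, base + i * offset

schedule = list(downtime_steps())
# schedule[0] == (0, 50); schedule[1] == (150, 95); schedule[-1] == (1500, 500)
```

The "Increasing downtime to 50 ms after 0 sec elapsed" message is the first entry of this schedule taking effect; the later "Downtime does not need to change" lines show the monitor re-checking the table each second without crossing a step boundary.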
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.329 227317 INFO nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.578 227317 DEBUG nova.compute.manager [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.578 227317 DEBUG oslo_concurrency.lockutils [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.579 227317 DEBUG oslo_concurrency.lockutils [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.579 227317 DEBUG oslo_concurrency.lockutils [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.579 227317 DEBUG nova.compute.manager [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.579 227317 WARNING nova.compute.manager [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.580 227317 DEBUG nova.compute.manager [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-changed-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.580 227317 DEBUG nova.compute.manager [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Refreshing instance network info cache due to event network-changed-c3bd4b07-ea7b-40da-8a33-0ac219177512. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.580 227317 DEBUG oslo_concurrency.lockutils [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.580 227317 DEBUG oslo_concurrency.lockutils [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.581 227317 DEBUG nova.network.neutron [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Refreshing network info cache for port c3bd4b07-ea7b-40da-8a33-0ac219177512 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.835 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.840 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 26 13:09:10 np0005596062 nova_compute[227313]: 2026-01-26 18:09:10.934 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:11.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:11.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.589 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.590 227317 DEBUG nova.virt.libvirt.migration [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.602 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450951.6018639, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.604 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:09:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.625 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.631 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.647 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 26 13:09:11 np0005596062 kernel: tapc3bd4b07-ea (unregistering): left promiscuous mode
Jan 26 13:09:11 np0005596062 NetworkManager[48993]: <info>  [1769450951.8331] device (tapc3bd4b07-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:09:11 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:11Z|00058|binding|INFO|Releasing lport c3bd4b07-ea7b-40da-8a33-0ac219177512 from this chassis (sb_readonly=0)
Jan 26 13:09:11 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:11Z|00059|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 down in Southbound
Jan 26 13:09:11 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:11Z|00060|binding|INFO|Removing iface tapc3bd4b07-ea ovn-installed in OVS
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.838 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.841 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:11.850 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:24:36 10.100.0.12'], port_security=['fa:16:3e:89:24:36 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c76f2593-4bbb-4cef-b447-9e180245ada6'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=c3bd4b07-ea7b-40da-8a33-0ac219177512) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:09:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:11.853 143929 INFO neutron.agent.ovn.metadata.agent [-] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b unbound from our chassis#033[00m
Jan 26 13:09:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:11.855 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0516cc55-93b8-4bf2-b595-d07702fa255b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:09:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:11.858 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[89c23b77-0851-455c-b89d-128025168475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:11.860 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b namespace which is not needed anymore#033[00m
Jan 26 13:09:11 np0005596062 nova_compute[227313]: 2026-01-26 18:09:11.901 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:11 np0005596062 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 26 13:09:11 np0005596062 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 13.712s CPU time.
Jan 26 13:09:11 np0005596062 systemd-machined[195380]: Machine qemu-4-instance-00000007 terminated.
Jan 26 13:09:11 np0005596062 virtqemud[226715]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-b5b60a57-95c9-48f2-a72a-66b14f738be8: No such file or directory
Jan 26 13:09:11 np0005596062 virtqemud[226715]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-b5b60a57-95c9-48f2-a72a-66b14f738be8: No such file or directory
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.009 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.014 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.024 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.025 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.025 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 26 13:09:12 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [NOTICE]   (233136) : haproxy version is 2.8.14-c23fe91
Jan 26 13:09:12 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [NOTICE]   (233136) : path to executable is /usr/sbin/haproxy
Jan 26 13:09:12 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [WARNING]  (233136) : Exiting Master process...
Jan 26 13:09:12 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [ALERT]    (233136) : Current worker (233138) exited with code 143 (Terminated)
Jan 26 13:09:12 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233131]: [WARNING]  (233136) : All workers exited. Exiting... (0)
Jan 26 13:09:12 np0005596062 systemd[1]: libpod-debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33.scope: Deactivated successfully.
Jan 26 13:09:12 np0005596062 podman[233452]: 2026-01-26 18:09:12.086290139 +0000 UTC m=+0.085187504 container died debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.093 227317 DEBUG nova.virt.libvirt.guest [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4c4b2733-13a7-49fe-bbfb-f3e063298716' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.094 227317 INFO nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migration operation has completed#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.094 227317 INFO nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] _post_live_migration() is started..#033[00m
Jan 26 13:09:12 np0005596062 systemd[1]: var-lib-containers-storage-overlay-08a58b7761373920f018a1da3ba20f4fb4a96a4aa42ec2f9713e82ef65831260-merged.mount: Deactivated successfully.
Jan 26 13:09:12 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33-userdata-shm.mount: Deactivated successfully.
Jan 26 13:09:12 np0005596062 podman[233452]: 2026-01-26 18:09:12.232555904 +0000 UTC m=+0.231453269 container cleanup debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 13:09:12 np0005596062 systemd[1]: libpod-conmon-debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33.scope: Deactivated successfully.
Jan 26 13:09:12 np0005596062 podman[233492]: 2026-01-26 18:09:12.642746027 +0000 UTC m=+0.388744555 container remove debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.653 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[03ce3d74-6547-4af5-831b-603d2cb29977]: (4, ('Mon Jan 26 06:09:11 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b (debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33)\ndebc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33\nMon Jan 26 06:09:12 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b (debc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33)\ndebc8bc3ca73c7d7b122f04471f9835774059cb1e65c8e507cc57f15ae930b33\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.656 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[15c153c6-fa49-4149-970d-ec7b70f34012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.657 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0516cc55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.659 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:12 np0005596062 kernel: tap0516cc55-90: left promiscuous mode
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.682 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.688 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1822ac76-af6d-4fab-b353-a407eaad0942]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.702 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7995dab9-97ce-49ed-94dc-8c67f50ec127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.714 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c74ad7f4-666d-4366-8f29-2b4b1a99bd82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.736 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[07bb51d7-6ddf-4ee6-875c-a0d544de5942]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469339, 'reachable_time': 41982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233511, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.740 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:09:12 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:12.740 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[dc31fe1f-a878-4cda-b133-d9bc356f54eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:12 np0005596062 systemd[1]: run-netns-ovnmeta\x2d0516cc55\x2d93b8\x2d4bf2\x2db595\x2dd07702fa255b.mount: Deactivated successfully.
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.810 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.810 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.811 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.811 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.811 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.811 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.811 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.812 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.812 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.812 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.812 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.813 227317 WARNING nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.813 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.813 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.813 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.814 227317 DEBUG oslo_concurrency.lockutils [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.814 227317 DEBUG nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:12 np0005596062 nova_compute[227313]: 2026-01-26 18:09:12.814 227317 WARNING nova.compute.manager [req-cc65001e-03d1-47b9-9eb8-09d8003fd107 req-40b31f9c-a932-49cb-b936-3d2026015a81 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.149 227317 DEBUG nova.network.neutron [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updated VIF entry in instance network info cache for port c3bd4b07-ea7b-40da-8a33-0ac219177512. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.150 227317 DEBUG nova.network.neutron [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating instance_info_cache with network_info: [{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.187 227317 DEBUG oslo_concurrency.lockutils [req-fb51a52d-cd9a-40a5-814f-4c6b4aec675b req-ee9595e9-b1d8-4729-997b-0caf400a5c30 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:09:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:13.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:13.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.700 227317 DEBUG nova.network.neutron [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Activated binding for port c3bd4b07-ea7b-40da-8a33-0ac219177512 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.701 227317 DEBUG nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.703 227317 DEBUG nova.virt.libvirt.vif [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-634605113',display_name='tempest-LiveMigrationTest-server-634605113',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-634605113',id=7,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:08:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-8pp60248',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:09:00Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.703 227317 DEBUG nova.network.os_vif_util [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converting VIF {"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.704 227317 DEBUG nova.network.os_vif_util [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.704 227317 DEBUG os_vif [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.707 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.707 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3bd4b07-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.710 227317 DEBUG nova.compute.manager [req-61801096-2d41-4eb4-9274-d2bb94b06130 req-4eeb126a-fdec-466b-9b1e-fefe461f9ecf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.710 227317 DEBUG oslo_concurrency.lockutils [req-61801096-2d41-4eb4-9274-d2bb94b06130 req-4eeb126a-fdec-466b-9b1e-fefe461f9ecf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.710 227317 DEBUG oslo_concurrency.lockutils [req-61801096-2d41-4eb4-9274-d2bb94b06130 req-4eeb126a-fdec-466b-9b1e-fefe461f9ecf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.711 227317 DEBUG oslo_concurrency.lockutils [req-61801096-2d41-4eb4-9274-d2bb94b06130 req-4eeb126a-fdec-466b-9b1e-fefe461f9ecf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.711 227317 DEBUG nova.compute.manager [req-61801096-2d41-4eb4-9274-d2bb94b06130 req-4eeb126a-fdec-466b-9b1e-fefe461f9ecf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.711 227317 DEBUG nova.compute.manager [req-61801096-2d41-4eb4-9274-d2bb94b06130 req-4eeb126a-fdec-466b-9b1e-fefe461f9ecf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.711 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.712 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.715 227317 INFO os_vif [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea')#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.716 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.716 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.716 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.716 227317 DEBUG nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.717 227317 INFO nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deleting instance files /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716_del#033[00m
Jan 26 13:09:13 np0005596062 nova_compute[227313]: 2026-01-26 18:09:13.717 227317 INFO nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deletion of /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716_del complete#033[00m
Jan 26 13:09:14 np0005596062 nova_compute[227313]: 2026-01-26 18:09:14.649 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.112 227317 DEBUG nova.compute.manager [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.113 227317 DEBUG oslo_concurrency.lockutils [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.113 227317 DEBUG oslo_concurrency.lockutils [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.113 227317 DEBUG oslo_concurrency.lockutils [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.113 227317 DEBUG nova.compute.manager [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.113 227317 WARNING nova.compute.manager [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.114 227317 DEBUG nova.compute.manager [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.114 227317 DEBUG oslo_concurrency.lockutils [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.114 227317 DEBUG oslo_concurrency.lockutils [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.114 227317 DEBUG oslo_concurrency.lockutils [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.114 227317 DEBUG nova.compute.manager [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:15 np0005596062 nova_compute[227313]: 2026-01-26 18:09:15.114 227317 WARNING nova.compute.manager [req-174eb4d4-f1b2-4aef-9e0a-4012f1b6b936 req-8775fdfa-b903-408e-83d5-6d1522d47bd4 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:09:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:15.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:09:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:09:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:17.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:17.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:18 np0005596062 nova_compute[227313]: 2026-01-26 18:09:18.712 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:19.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:19.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.648 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.648 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.649 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.651 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.684 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.685 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.685 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.685 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:09:19 np0005596062 nova_compute[227313]: 2026-01-26 18:09:19.686 227317 DEBUG oslo_concurrency.processutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:09:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/798403029' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.107 227317 DEBUG oslo_concurrency.processutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.309 227317 WARNING nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.310 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4863MB free_disk=20.94263458251953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.310 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.311 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.504 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Migration for instance 4c4b2733-13a7-49fe-bbfb-f3e063298716 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.527 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.584 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Migration 10acc998-e5ff-412e-99fb-f31cf0378f0f is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.584 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.585 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:09:20 np0005596062 nova_compute[227313]: 2026-01-26 18:09:20.630 227317 DEBUG oslo_concurrency.processutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:21.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:09:21 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1273812040' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.271 227317 DEBUG oslo_concurrency.processutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.641s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.281 227317 DEBUG nova.compute.provider_tree [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:09:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:21.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.310 227317 DEBUG nova.scheduler.client.report [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.337 227317 DEBUG nova.compute.resource_tracker [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.338 227317 DEBUG oslo_concurrency.lockutils [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.344 227317 INFO nova.compute.manager [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.439 227317 INFO nova.scheduler.client.report [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Deleted allocation for migration 10acc998-e5ff-412e-99fb-f31cf0378f0f#033[00m
Jan 26 13:09:21 np0005596062 nova_compute[227313]: 2026-01-26 18:09:21.440 227317 DEBUG nova.virt.libvirt.driver [None req-101dd287-f7f0-4e5c-b813-3d7fc5d02ccf 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 26 13:09:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:22 np0005596062 nova_compute[227313]: 2026-01-26 18:09:22.975 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Creating tmpfile /var/lib/nova/instances/tmpe_c_rmgx to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 26 13:09:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:23.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:23.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:23 np0005596062 nova_compute[227313]: 2026-01-26 18:09:23.716 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:24 np0005596062 nova_compute[227313]: 2026-01-26 18:09:24.033 227317 DEBUG nova.compute.manager [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe_c_rmgx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 26 13:09:24 np0005596062 nova_compute[227313]: 2026-01-26 18:09:24.702 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:25.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:25.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:25 np0005596062 nova_compute[227313]: 2026-01-26 18:09:25.512 227317 DEBUG nova.compute.manager [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe_c_rmgx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4c4b2733-13a7-49fe-bbfb-f3e063298716',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 26 13:09:25 np0005596062 nova_compute[227313]: 2026-01-26 18:09:25.562 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:09:25 np0005596062 nova_compute[227313]: 2026-01-26 18:09:25.563 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquired lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:09:25 np0005596062 nova_compute[227313]: 2026-01-26 18:09:25.563 227317 DEBUG nova.network.neutron [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:09:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.022 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769450952.020788, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.022 227317 INFO nova.compute.manager [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.066 227317 DEBUG nova.compute.manager [None req-976e7aed-ec54-4cee-9497-fe74c565216a - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.200 227317 DEBUG nova.network.neutron [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating instance_info_cache with network_info: [{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.255 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Releasing lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:09:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:27.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.258 227317 DEBUG os_brick.utils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.260 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.282 232828 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.283 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[27f8e2ee-9ed7-451f-8e78-ac9a11d63bba]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.286 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:27.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.300 232828 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.302 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[c47de3a8-0a00-45fc-98ce-f3116f2f762d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:c828cff26df4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.304 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.319 232828 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.319 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[7141a588-42d4-4c2f-9903-943619d0025b]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.321 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[6087b54a-2f49-4e0e-8925-1befe0cb6846]: (4, '5c33c4b0-14ac-46af-8c94-d3bb1b6300af') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.322 227317 DEBUG oslo_concurrency.processutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.355 227317 DEBUG oslo_concurrency.processutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] CMD "nvme version" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.357 227317 DEBUG os_brick.initiator.connectors.lightos [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.357 227317 DEBUG os_brick.initiator.connectors.lightos [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.358 227317 DEBUG os_brick.initiator.connectors.lightos [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 26 13:09:27 np0005596062 nova_compute[227313]: 2026-01-26 18:09:27.358 227317 DEBUG os_brick.utils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] <== get_connector_properties: return (98ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:c828cff26df4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5c33c4b0-14ac-46af-8c94-d3bb1b6300af', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 26 13:09:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:09:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/743329445' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:09:28 np0005596062 nova_compute[227313]: 2026-01-26 18:09:28.719 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:28 np0005596062 podman[233672]: 2026-01-26 18:09:28.896955297 +0000 UTC m=+0.101462476 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.001 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe_c_rmgx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4c4b2733-13a7-49fe-bbfb-f3e063298716',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={b5b60a57-95c9-48f2-a72a-66b14f738be8='ac2346bc-53c2-4bf5-b1e2-545f402e338e'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.002 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Creating instance directory: /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.003 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Ensure instance console log exists: /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.004 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.008 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.011 227317 DEBUG nova.virt.libvirt.vif [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-26T18:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-634605113',display_name='tempest-LiveMigrationTest-server-634605113',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-634605113',id=7,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:08:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-8pp60248',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-
877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:09:18Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.011 227317 DEBUG nova.network.os_vif_util [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converting VIF {"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.013 227317 DEBUG nova.network.os_vif_util [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.013 227317 DEBUG os_vif [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.014 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.015 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.015 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.022 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.022 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3bd4b07-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.023 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3bd4b07-ea, col_values=(('external_ids', {'iface-id': 'c3bd4b07-ea7b-40da-8a33-0ac219177512', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:24:36', 'vm-uuid': '4c4b2733-13a7-49fe-bbfb-f3e063298716'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.026 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:29 np0005596062 NetworkManager[48993]: <info>  [1769450969.0277] manager: (tapc3bd4b07-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.029 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.037 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.038 227317 INFO os_vif [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea')#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.045 227317 DEBUG nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.046 227317 DEBUG nova.compute.manager [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe_c_rmgx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4c4b2733-13a7-49fe-bbfb-f3e063298716',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={b5b60a57-95c9-48f2-a72a-66b14f738be8='ac2346bc-53c2-4bf5-b1e2-545f402e338e'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 26 13:09:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:29.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:29.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:29 np0005596062 nova_compute[227313]: 2026-01-26 18:09:29.704 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:30 np0005596062 nova_compute[227313]: 2026-01-26 18:09:30.526 227317 DEBUG nova.network.neutron [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 26 13:09:31 np0005596062 nova_compute[227313]: 2026-01-26 18:09:31.032 227317 DEBUG nova.compute.manager [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe_c_rmgx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='4c4b2733-13a7-49fe-bbfb-f3e063298716',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={b5b60a57-95c9-48f2-a72a-66b14f738be8='ac2346bc-53c2-4bf5-b1e2-545f402e338e'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 26 13:09:31 np0005596062 systemd[1]: Starting libvirt proxy daemon...
Jan 26 13:09:31 np0005596062 systemd[1]: Started libvirt proxy daemon.
Jan 26 13:09:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:31.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:31.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:31 np0005596062 kernel: tapc3bd4b07-ea: entered promiscuous mode
Jan 26 13:09:31 np0005596062 NetworkManager[48993]: <info>  [1769450971.3624] manager: (tapc3bd4b07-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 26 13:09:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:31Z|00061|binding|INFO|Claiming lport c3bd4b07-ea7b-40da-8a33-0ac219177512 for this additional chassis.
Jan 26 13:09:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:31Z|00062|binding|INFO|c3bd4b07-ea7b-40da-8a33-0ac219177512: Claiming fa:16:3e:89:24:36 10.100.0.12
Jan 26 13:09:31 np0005596062 nova_compute[227313]: 2026-01-26 18:09:31.363 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:31 np0005596062 nova_compute[227313]: 2026-01-26 18:09:31.380 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:31 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:31Z|00063|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 ovn-installed in OVS
Jan 26 13:09:31 np0005596062 nova_compute[227313]: 2026-01-26 18:09:31.383 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:31 np0005596062 systemd-machined[195380]: New machine qemu-5-instance-00000007.
Jan 26 13:09:31 np0005596062 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Jan 26 13:09:31 np0005596062 systemd-udevd[233727]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:09:31 np0005596062 NetworkManager[48993]: <info>  [1769450971.4571] device (tapc3bd4b07-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:09:31 np0005596062 NetworkManager[48993]: <info>  [1769450971.4586] device (tapc3bd4b07-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:09:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.392 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450972.3923578, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.393 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Started (Lifecycle Event)#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.414 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.883 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769450972.8828573, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.884 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.906 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.911 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:09:32 np0005596062 nova_compute[227313]: 2026-01-26 18:09:32.943 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 26 13:09:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:33.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:33.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:34 np0005596062 nova_compute[227313]: 2026-01-26 18:09:34.027 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:34 np0005596062 nova_compute[227313]: 2026-01-26 18:09:34.765 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:34Z|00064|binding|INFO|Claiming lport c3bd4b07-ea7b-40da-8a33-0ac219177512 for this chassis.
Jan 26 13:09:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:34Z|00065|binding|INFO|c3bd4b07-ea7b-40da-8a33-0ac219177512: Claiming fa:16:3e:89:24:36 10.100.0.12
Jan 26 13:09:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:34Z|00066|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 up in Southbound
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.909 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:24:36 10.100.0.12'], port_security=['fa:16:3e:89:24:36 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '21', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=c3bd4b07-ea7b-40da-8a33-0ac219177512) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.911 143929 INFO neutron.agent.ovn.metadata.agent [-] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b bound to our chassis#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.915 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0516cc55-93b8-4bf2-b595-d07702fa255b#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.929 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cab4efbd-99d1-4c7f-aa0d-2d8613e4be4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.930 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0516cc55-91 in ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.934 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0516cc55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.934 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c22a41-9a20-4942-81f9-d4f5e59bf81e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.936 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb973f3-bde1-4cd6-a28a-a9a5152667ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.949 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d39d24-8025-4be0-bbc7-fdad9e864806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:34.973 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[31c49138-5814-4450-8f7d-cb9da305bae8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.003 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[5563b3ef-90e1-43bc-a2dc-3ba230e5b700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.010 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ec768f-c0c5-4722-91b3-3ac252ea245f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 NetworkManager[48993]: <info>  [1769450975.0123] manager: (tap0516cc55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 26 13:09:35 np0005596062 systemd-udevd[233787]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.045 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[c11c2513-326e-4fd6-940c-0b4667fec29f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.048 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[14284538-95a2-49b2-b42f-82d2ce00e8eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 NetworkManager[48993]: <info>  [1769450975.0781] device (tap0516cc55-90): carrier: link connected
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.081 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[0c280537-5abe-4c9b-95a0-79511b0f6559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.101 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[8b77317b-3be8-48f7-a49e-b3a37bb30a1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0516cc55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:40:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473512, 'reachable_time': 28692, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233806, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.118 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bf87de54-6ba4-49d1-94b4-d321479a835d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:40ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473512, 'tstamp': 473512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233807, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.139 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ec9e1042-b31d-46ba-8f9e-2c2ddf023873]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0516cc55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:40:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473512, 'reachable_time': 28692, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233808, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.167 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c407edf6-db4c-426a-b9fc-ba6b41e57489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.169 227317 INFO nova.compute.manager [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Post operation of migration started#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.235 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e6984c-1108-4741-ab4c-26cacf78dc08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.238 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0516cc55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.239 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.239 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0516cc55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:35 np0005596062 NetworkManager[48993]: <info>  [1769450975.2433] manager: (tap0516cc55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 26 13:09:35 np0005596062 kernel: tap0516cc55-90: entered promiscuous mode
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.243 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.246 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0516cc55-90, col_values=(('external_ids', {'iface-id': '46cfbba6-430a-495c-9d6a-60cf58c877d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.248 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:35Z|00067|binding|INFO|Releasing lport 46cfbba6-430a-495c-9d6a-60cf58c877d3 from this chassis (sb_readonly=0)
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.262 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.263 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.264 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ee670122-8f37-4a82-9282-528334156e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.265 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-0516cc55-93b8-4bf2-b595-d07702fa255b
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/0516cc55-93b8-4bf2-b595-d07702fa255b.pid.haproxy
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 0516cc55-93b8-4bf2-b595-d07702fa255b
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:09:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:35.266 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'env', 'PROCESS_TAG=haproxy-0516cc55-93b8-4bf2-b595-d07702fa255b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0516cc55-93b8-4bf2-b595-d07702fa255b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:09:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:35.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:35.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.513 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.514 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquired lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:09:35 np0005596062 nova_compute[227313]: 2026-01-26 18:09:35.514 227317 DEBUG nova.network.neutron [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:09:35 np0005596062 podman[233841]: 2026-01-26 18:09:35.639113641 +0000 UTC m=+0.022731857 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:09:36 np0005596062 podman[233841]: 2026-01-26 18:09:36.062959865 +0000 UTC m=+0.446578051 container create 1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 13:09:36 np0005596062 systemd[1]: Started libpod-conmon-1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6.scope.
Jan 26 13:09:36 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:09:36 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fc7f941721d5260a10036a8d9c2f468fa8b5b0cc529e679dbb92520f43d4a19/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:09:36 np0005596062 podman[233841]: 2026-01-26 18:09:36.185577202 +0000 UTC m=+0.569195418 container init 1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:09:36 np0005596062 podman[233841]: 2026-01-26 18:09:36.196910884 +0000 UTC m=+0.580529080 container start 1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 13:09:36 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [NOTICE]   (233862) : New worker (233864) forked
Jan 26 13:09:36 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [NOTICE]   (233862) : Loading success.
Jan 26 13:09:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:36 np0005596062 podman[233873]: 2026-01-26 18:09:36.904771666 +0000 UTC m=+0.104534046 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 13:09:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:37.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:37.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:37 np0005596062 nova_compute[227313]: 2026-01-26 18:09:37.408 227317 DEBUG nova.network.neutron [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating instance_info_cache with network_info: [{"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:09:37 np0005596062 nova_compute[227313]: 2026-01-26 18:09:37.429 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Releasing lock "refresh_cache-4c4b2733-13a7-49fe-bbfb-f3e063298716" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:09:37 np0005596062 nova_compute[227313]: 2026-01-26 18:09:37.446 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:37 np0005596062 nova_compute[227313]: 2026-01-26 18:09:37.447 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:37 np0005596062 nova_compute[227313]: 2026-01-26 18:09:37.447 227317 DEBUG oslo_concurrency.lockutils [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:37 np0005596062 nova_compute[227313]: 2026-01-26 18:09:37.453 227317 INFO nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 26 13:09:37 np0005596062 virtqemud[226715]: Domain id=5 name='instance-00000007' uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716 is tainted: custom-monitor
Jan 26 13:09:38 np0005596062 nova_compute[227313]: 2026-01-26 18:09:38.464 227317 INFO nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.030 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:39.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:39.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.471 227317 INFO nova.virt.libvirt.driver [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.476 227317 DEBUG nova.compute.manager [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.505 227317 DEBUG nova.objects.instance [None req-0e651a6d-bbb5-488a-bbe8-7c3372884e76 430881eef73e44a38752c2354824111c 9a36b7a9c98845ffaadadf6d0a7eb3a8 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.769 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.943 227317 DEBUG nova.compute.manager [req-13a2c9c9-38c0-4567-8812-357c8422962f req-c3beb139-8740-432d-a51b-244f27717742 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.943 227317 DEBUG oslo_concurrency.lockutils [req-13a2c9c9-38c0-4567-8812-357c8422962f req-c3beb139-8740-432d-a51b-244f27717742 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.944 227317 DEBUG oslo_concurrency.lockutils [req-13a2c9c9-38c0-4567-8812-357c8422962f req-c3beb139-8740-432d-a51b-244f27717742 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.944 227317 DEBUG oslo_concurrency.lockutils [req-13a2c9c9-38c0-4567-8812-357c8422962f req-c3beb139-8740-432d-a51b-244f27717742 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.944 227317 DEBUG nova.compute.manager [req-13a2c9c9-38c0-4567-8812-357c8422962f req-c3beb139-8740-432d-a51b-244f27717742 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:39 np0005596062 nova_compute[227313]: 2026-01-26 18:09:39.945 227317 WARNING nova.compute.manager [req-13a2c9c9-38c0-4567-8812-357c8422962f req-c3beb139-8740-432d-a51b-244f27717742 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:09:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:41.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:41.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:42 np0005596062 nova_compute[227313]: 2026-01-26 18:09:42.343 227317 DEBUG nova.compute.manager [req-44e16842-6857-4224-a582-96d7c2edf5a3 req-679f0389-4868-440f-b4ae-7c73bed0c2b9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:42 np0005596062 nova_compute[227313]: 2026-01-26 18:09:42.344 227317 DEBUG oslo_concurrency.lockutils [req-44e16842-6857-4224-a582-96d7c2edf5a3 req-679f0389-4868-440f-b4ae-7c73bed0c2b9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:42 np0005596062 nova_compute[227313]: 2026-01-26 18:09:42.344 227317 DEBUG oslo_concurrency.lockutils [req-44e16842-6857-4224-a582-96d7c2edf5a3 req-679f0389-4868-440f-b4ae-7c73bed0c2b9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:42 np0005596062 nova_compute[227313]: 2026-01-26 18:09:42.345 227317 DEBUG oslo_concurrency.lockutils [req-44e16842-6857-4224-a582-96d7c2edf5a3 req-679f0389-4868-440f-b4ae-7c73bed0c2b9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:42 np0005596062 nova_compute[227313]: 2026-01-26 18:09:42.345 227317 DEBUG nova.compute.manager [req-44e16842-6857-4224-a582-96d7c2edf5a3 req-679f0389-4868-440f-b4ae-7c73bed0c2b9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:42 np0005596062 nova_compute[227313]: 2026-01-26 18:09:42.345 227317 WARNING nova.compute.manager [req-44e16842-6857-4224-a582-96d7c2edf5a3 req-679f0389-4868-440f-b4ae-7c73bed0c2b9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:09:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:43.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:43.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.889 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.890 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.890 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.891 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.891 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.893 227317 INFO nova.compute.manager [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Terminating instance#033[00m
Jan 26 13:09:43 np0005596062 nova_compute[227313]: 2026-01-26 18:09:43.895 227317 DEBUG nova.compute.manager [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:09:43 np0005596062 kernel: tapc3bd4b07-ea (unregistering): left promiscuous mode
Jan 26 13:09:43 np0005596062 NetworkManager[48993]: <info>  [1769450983.9444] device (tapc3bd4b07-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00068|binding|INFO|Releasing lport c3bd4b07-ea7b-40da-8a33-0ac219177512 from this chassis (sb_readonly=0)
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.012 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00069|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 down in Southbound
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00070|binding|INFO|Removing iface tapc3bd4b07-ea ovn-installed in OVS
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.015 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.019 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:24:36 10.100.0.12'], port_security=['fa:16:3e:89:24:36 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '23', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=c3bd4b07-ea7b-40da-8a33-0ac219177512) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.021 143929 INFO neutron.agent.ovn.metadata.agent [-] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b unbound from our chassis#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.022 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0516cc55-93b8-4bf2-b595-d07702fa255b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.023 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[dd8e3596-3137-4e38-8839-472f53cce17b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.024 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b namespace which is not needed anymore#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.028 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.031 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 26 13:09:44 np0005596062 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 1.854s CPU time.
Jan 26 13:09:44 np0005596062 systemd-machined[195380]: Machine qemu-5-instance-00000007 terminated.
Jan 26 13:09:44 np0005596062 kernel: tapc3bd4b07-ea: entered promiscuous mode
Jan 26 13:09:44 np0005596062 NetworkManager[48993]: <info>  [1769450984.1155] manager: (tapc3bd4b07-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 26 13:09:44 np0005596062 systemd-udevd[233911]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.116 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00071|binding|INFO|Claiming lport c3bd4b07-ea7b-40da-8a33-0ac219177512 for this chassis.
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00072|binding|INFO|c3bd4b07-ea7b-40da-8a33-0ac219177512: Claiming fa:16:3e:89:24:36 10.100.0.12
Jan 26 13:09:44 np0005596062 kernel: tapc3bd4b07-ea (unregistering): left promiscuous mode
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.131 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:24:36 10.100.0.12'], port_security=['fa:16:3e:89:24:36 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '23', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=c3bd4b07-ea7b-40da-8a33-0ac219177512) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.139 227317 INFO nova.virt.libvirt.driver [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Instance destroyed successfully.#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.140 227317 DEBUG nova.objects.instance [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lazy-loading 'resources' on Instance uuid 4c4b2733-13a7-49fe-bbfb-f3e063298716 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00073|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 ovn-installed in OVS
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00074|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 up in Southbound
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.148 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00075|binding|INFO|Releasing lport c3bd4b07-ea7b-40da-8a33-0ac219177512 from this chassis (sb_readonly=1)
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00076|if_status|INFO|Not setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 down as sb is readonly
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.149 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00077|binding|INFO|Removing iface tapc3bd4b07-ea ovn-installed in OVS
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.151 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00078|binding|INFO|Releasing lport c3bd4b07-ea7b-40da-8a33-0ac219177512 from this chassis (sb_readonly=0)
Jan 26 13:09:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:09:44Z|00079|binding|INFO|Setting lport c3bd4b07-ea7b-40da-8a33-0ac219177512 down in Southbound
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.156 227317 DEBUG nova.virt.libvirt.vif [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-26T18:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-634605113',display_name='tempest-LiveMigrationTest-server-634605113',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-634605113',id=7,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:08:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b1f2cad350784d7eae39fc23fb032500',ramdisk_id='',reservation_id='r-8pp60248',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-877386369',owner_user_name='tempest-LiveMigrationTest-877386369-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:09:39Z,user_data=None,user_id='9e3f505042e7463683259f02e8e59eca',uuid=4c4b2733-13a7-49fe-bbfb-f3e063298716,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.156 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:24:36 10.100.0.12'], port_security=['fa:16:3e:89:24:36 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c4b2733-13a7-49fe-bbfb-f3e063298716', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0516cc55-93b8-4bf2-b595-d07702fa255b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1f2cad350784d7eae39fc23fb032500', 'neutron:revision_number': '23', 'neutron:security_group_ids': '4e1bd851-4cc2-4677-be2e-39f74460bffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db9bad5b-1a88-4481-85c1-c131f59dea19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=c3bd4b07-ea7b-40da-8a33-0ac219177512) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.157 227317 DEBUG nova.network.os_vif_util [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converting VIF {"id": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "address": "fa:16:3e:89:24:36", "network": {"id": "0516cc55-93b8-4bf2-b595-d07702fa255b", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1766120094-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1f2cad350784d7eae39fc23fb032500", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bd4b07-ea", "ovs_interfaceid": "c3bd4b07-ea7b-40da-8a33-0ac219177512", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.158 227317 DEBUG nova.network.os_vif_util [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.158 227317 DEBUG os_vif [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.160 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.160 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3bd4b07-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.162 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.164 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.164 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.167 227317 INFO os_vif [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:24:36,bridge_name='br-int',has_traffic_filtering=True,id=c3bd4b07-ea7b-40da-8a33-0ac219177512,network=Network(0516cc55-93b8-4bf2-b595-d07702fa255b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bd4b07-ea')#033[00m
Jan 26 13:09:44 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [NOTICE]   (233862) : haproxy version is 2.8.14-c23fe91
Jan 26 13:09:44 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [NOTICE]   (233862) : path to executable is /usr/sbin/haproxy
Jan 26 13:09:44 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [WARNING]  (233862) : Exiting Master process...
Jan 26 13:09:44 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [WARNING]  (233862) : Exiting Master process...
Jan 26 13:09:44 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [ALERT]    (233862) : Current worker (233864) exited with code 143 (Terminated)
Jan 26 13:09:44 np0005596062 neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b[233858]: [WARNING]  (233862) : All workers exited. Exiting... (0)
Jan 26 13:09:44 np0005596062 systemd[1]: libpod-1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6.scope: Deactivated successfully.
Jan 26 13:09:44 np0005596062 podman[233934]: 2026-01-26 18:09:44.210212588 +0000 UTC m=+0.052676125 container died 1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 13:09:44 np0005596062 systemd[1]: var-lib-containers-storage-overlay-6fc7f941721d5260a10036a8d9c2f468fa8b5b0cc529e679dbb92520f43d4a19-merged.mount: Deactivated successfully.
Jan 26 13:09:44 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6-userdata-shm.mount: Deactivated successfully.
Jan 26 13:09:44 np0005596062 podman[233934]: 2026-01-26 18:09:44.416782562 +0000 UTC m=+0.259246079 container cleanup 1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:09:44 np0005596062 systemd[1]: libpod-conmon-1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6.scope: Deactivated successfully.
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.555 227317 DEBUG nova.compute.manager [req-cb9fa9a9-ee21-4490-b8ed-fdce6b19de82 req-c9c015be-cc42-4538-8783-1bdee318982b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.556 227317 DEBUG oslo_concurrency.lockutils [req-cb9fa9a9-ee21-4490-b8ed-fdce6b19de82 req-c9c015be-cc42-4538-8783-1bdee318982b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.556 227317 DEBUG oslo_concurrency.lockutils [req-cb9fa9a9-ee21-4490-b8ed-fdce6b19de82 req-c9c015be-cc42-4538-8783-1bdee318982b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.557 227317 DEBUG oslo_concurrency.lockutils [req-cb9fa9a9-ee21-4490-b8ed-fdce6b19de82 req-c9c015be-cc42-4538-8783-1bdee318982b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.557 227317 DEBUG nova.compute.manager [req-cb9fa9a9-ee21-4490-b8ed-fdce6b19de82 req-c9c015be-cc42-4538-8783-1bdee318982b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.557 227317 DEBUG nova.compute.manager [req-cb9fa9a9-ee21-4490-b8ed-fdce6b19de82 req-c9c015be-cc42-4538-8783-1bdee318982b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-unplugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:09:44 np0005596062 podman[233982]: 2026-01-26 18:09:44.567472407 +0000 UTC m=+0.127189250 container remove 1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.577 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[de67e05c-4935-4b59-a277-700449ce6415]: (4, ('Mon Jan 26 06:09:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b (1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6)\n1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6\nMon Jan 26 06:09:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b (1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6)\n1c1aebaf1c5123861edc42b332e7d514e20471e00c26f10237e9bcaccf5956c6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.579 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f51664b1-0be4-4e44-9982-fde68ab89874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.580 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0516cc55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.582 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 kernel: tap0516cc55-90: left promiscuous mode
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.597 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.600 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cc59e400-00b5-4b4c-b4b5-887531b57e4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.624 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ebef6c71-f1c3-45e1-921a-7d2159907970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.625 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0a524f-2d6f-419d-827e-1a749b6716dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.640 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[79e91843-80c5-4b36-8316-5f79f23dd0ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473504, 'reachable_time': 16042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233997, 'error': None, 'target': 'ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.643 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0516cc55-93b8-4bf2-b595-d07702fa255b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.643 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[7e684d84-ce9e-46cc-b8fc-3497030f5dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.644 143929 INFO neutron.agent.ovn.metadata.agent [-] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b unbound from our chassis#033[00m
Jan 26 13:09:44 np0005596062 systemd[1]: run-netns-ovnmeta\x2d0516cc55\x2d93b8\x2d4bf2\x2db595\x2dd07702fa255b.mount: Deactivated successfully.
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.647 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0516cc55-93b8-4bf2-b595-d07702fa255b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.647 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0aaba5-4914-4e90-8539-66cf00336d29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.648 143929 INFO neutron.agent.ovn.metadata.agent [-] Port c3bd4b07-ea7b-40da-8a33-0ac219177512 in datapath 0516cc55-93b8-4bf2-b595-d07702fa255b unbound from our chassis#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.651 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0516cc55-93b8-4bf2-b595-d07702fa255b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:09:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:44.652 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[942b8721-3c4f-4b81-b3a0-1f7bea8340b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.771 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.947 227317 INFO nova.virt.libvirt.driver [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deleting instance files /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716_del#033[00m
Jan 26 13:09:44 np0005596062 nova_compute[227313]: 2026-01-26 18:09:44.948 227317 INFO nova.virt.libvirt.driver [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deletion of /var/lib/nova/instances/4c4b2733-13a7-49fe-bbfb-f3e063298716_del complete#033[00m
Jan 26 13:09:45 np0005596062 nova_compute[227313]: 2026-01-26 18:09:45.005 227317 INFO nova.compute.manager [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:09:45 np0005596062 nova_compute[227313]: 2026-01-26 18:09:45.005 227317 DEBUG oslo.service.loopingcall [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:09:45 np0005596062 nova_compute[227313]: 2026-01-26 18:09:45.006 227317 DEBUG nova.compute.manager [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:09:45 np0005596062 nova_compute[227313]: 2026-01-26 18:09:45.006 227317 DEBUG nova.network.neutron [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:09:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:45.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:45.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.366 227317 DEBUG nova.network.neutron [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.390 227317 INFO nova.compute.manager [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Took 1.38 seconds to deallocate network for instance.#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.569 227317 DEBUG nova.compute.manager [req-8bc7cce8-2b94-4a25-bbec-9532e11eaf5a req-6319d6ba-87af-457e-91b0-c234ecd1c1d3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-deleted-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.707 227317 DEBUG nova.compute.manager [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.708 227317 DEBUG oslo_concurrency.lockutils [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.708 227317 DEBUG oslo_concurrency.lockutils [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.708 227317 DEBUG oslo_concurrency.lockutils [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.709 227317 DEBUG nova.compute.manager [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.709 227317 WARNING nova.compute.manager [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.709 227317 DEBUG nova.compute.manager [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.709 227317 DEBUG oslo_concurrency.lockutils [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.710 227317 DEBUG oslo_concurrency.lockutils [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.710 227317 DEBUG oslo_concurrency.lockutils [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.710 227317 DEBUG nova.compute.manager [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] No waiting events found dispatching network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.710 227317 WARNING nova.compute.manager [req-db34c3aa-61f0-4881-a8b3-77edda5964f4 req-2f8d45f6-5f50-479b-ad11-039688d06530 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Received unexpected event network-vif-plugged-c3bd4b07-ea7b-40da-8a33-0ac219177512 for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.723 227317 INFO nova.compute.manager [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Took 0.33 seconds to detach 1 volumes for instance.#033[00m
Jan 26 13:09:46 np0005596062 nova_compute[227313]: 2026-01-26 18:09:46.724 227317 DEBUG nova.compute.manager [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Deleting volume: b5b60a57-95c9-48f2-a72a-66b14f738be8 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 26 13:09:47 np0005596062 nova_compute[227313]: 2026-01-26 18:09:47.156 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:47 np0005596062 nova_compute[227313]: 2026-01-26 18:09:47.156 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:47 np0005596062 nova_compute[227313]: 2026-01-26 18:09:47.163 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:47 np0005596062 nova_compute[227313]: 2026-01-26 18:09:47.201 227317 INFO nova.scheduler.client.report [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Deleted allocations for instance 4c4b2733-13a7-49fe-bbfb-f3e063298716#033[00m
Jan 26 13:09:47 np0005596062 nova_compute[227313]: 2026-01-26 18:09:47.265 227317 DEBUG oslo_concurrency.lockutils [None req-237cb5ec-05d2-48ba-a4f2-3fcb96640816 9e3f505042e7463683259f02e8e59eca b1f2cad350784d7eae39fc23fb032500 - - default default] Lock "4c4b2733-13a7-49fe-bbfb-f3e063298716" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:47.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:47.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:49 np0005596062 nova_compute[227313]: 2026-01-26 18:09:49.164 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:49.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:49.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:49 np0005596062 nova_compute[227313]: 2026-01-26 18:09:49.822 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:51.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:09:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:51.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:09:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:53.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:53.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:53 np0005596062 nova_compute[227313]: 2026-01-26 18:09:53.453 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:54 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:54.022 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.022 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:54 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:54.024 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:09:54 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:09:54.026 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.045 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.166 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:54 np0005596062 nova_compute[227313]: 2026-01-26 18:09:54.825 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:55 np0005596062 nova_compute[227313]: 2026-01-26 18:09:55.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:55.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:55.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:56 np0005596062 nova_compute[227313]: 2026-01-26 18:09:56.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:56 np0005596062 nova_compute[227313]: 2026-01-26 18:09:56.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.100 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.101 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.101 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.131 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.131 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.156 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.156 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.157 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.157 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.158 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:57.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:57.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:09:57 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2074920665' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.626 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.776 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.777 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4881MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.777 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.777 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.840 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.841 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:09:57 np0005596062 nova_compute[227313]: 2026-01-26 18:09:57.857 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:09:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:09:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1452272809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:09:58 np0005596062 nova_compute[227313]: 2026-01-26 18:09:58.326 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:09:58 np0005596062 nova_compute[227313]: 2026-01-26 18:09:58.332 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:09:58 np0005596062 nova_compute[227313]: 2026-01-26 18:09:58.348 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:09:58 np0005596062 nova_compute[227313]: 2026-01-26 18:09:58.350 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:09:58 np0005596062 nova_compute[227313]: 2026-01-26 18:09:58.351 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.138 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769450984.1365726, 4c4b2733-13a7-49fe-bbfb-f3e063298716 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.138 227317 INFO nova.compute.manager [-] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.161 227317 DEBUG nova.compute.manager [None req-9fbe0e0d-6bca-482e-bea6-3f50240d14ff - - - - - -] [instance: 4c4b2733-13a7-49fe-bbfb-f3e063298716] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:09:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:09:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:09:59.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:09:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:09:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:09:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:09:59.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.826 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.827 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.828 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.828 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.851 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:09:59 np0005596062 nova_compute[227313]: 2026-01-26 18:09:59.852 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:09:59 np0005596062 podman[234101]: 2026-01-26 18:09:59.969226382 +0000 UTC m=+0.088167780 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.784 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.784 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.820 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.890 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.891 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.927 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:10:00 np0005596062 nova_compute[227313]: 2026-01-26 18:10:00.928 227317 INFO nova.compute.claims [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:10:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:01.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:01.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.382 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.613 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:10:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1331398528' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.879 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.886 227317 DEBUG nova.compute.provider_tree [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.910 227317 DEBUG nova.scheduler.client.report [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.940 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:01 np0005596062 nova_compute[227313]: 2026-01-26 18:10:01.941 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.011 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.012 227317 DEBUG nova.network.neutron [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.042 227317 INFO nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.059 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.196 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.197 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.198 227317 INFO nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Creating image(s)#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.229 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.259 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.287 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.291 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.314 227317 DEBUG nova.policy [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6edbbcec1ba44e8e815998a86fd7dcbb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5d9e93b058648dd9344a83f8a43b553', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.353 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.354 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.355 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.356 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.393 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:02 np0005596062 nova_compute[227313]: 2026-01-26 18:10:02.398 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:02 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:10:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:03.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:03.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:03 np0005596062 nova_compute[227313]: 2026-01-26 18:10:03.544 227317 DEBUG nova.network.neutron [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Successfully created port: 252e1f93-8a26-43a8-aabc-14548c4d04d5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:10:03 np0005596062 nova_compute[227313]: 2026-01-26 18:10:03.827 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:03 np0005596062 nova_compute[227313]: 2026-01-26 18:10:03.924 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] resizing rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.083 227317 DEBUG nova.objects.instance [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lazy-loading 'migration_context' on Instance uuid e3ae83e1-a1df-447d-aeb0-61a2999954d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.107 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.107 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Ensure instance console log exists: /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.108 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.108 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.108 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.356 227317 DEBUG nova.network.neutron [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Successfully updated port: 252e1f93-8a26-43a8-aabc-14548c4d04d5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.382 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "refresh_cache-e3ae83e1-a1df-447d-aeb0-61a2999954d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.383 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquired lock "refresh_cache-e3ae83e1-a1df-447d-aeb0-61a2999954d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.384 227317 DEBUG nova.network.neutron [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.478 227317 DEBUG nova.compute.manager [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-changed-252e1f93-8a26-43a8-aabc-14548c4d04d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.478 227317 DEBUG nova.compute.manager [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Refreshing instance network info cache due to event network-changed-252e1f93-8a26-43a8-aabc-14548c4d04d5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.479 227317 DEBUG oslo_concurrency.lockutils [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-e3ae83e1-a1df-447d-aeb0-61a2999954d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.568 227317 DEBUG nova.network.neutron [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:10:04 np0005596062 nova_compute[227313]: 2026-01-26 18:10:04.854 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:05.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:05.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.705 227317 DEBUG nova.network.neutron [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Updating instance_info_cache with network_info: [{"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.733 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Releasing lock "refresh_cache-e3ae83e1-a1df-447d-aeb0-61a2999954d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.733 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Instance network_info: |[{"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.734 227317 DEBUG oslo_concurrency.lockutils [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-e3ae83e1-a1df-447d-aeb0-61a2999954d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.734 227317 DEBUG nova.network.neutron [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Refreshing network info cache for port 252e1f93-8a26-43a8-aabc-14548c4d04d5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.738 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Start _get_guest_xml network_info=[{"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.744 227317 WARNING nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.750 227317 DEBUG nova.virt.libvirt.host [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.752 227317 DEBUG nova.virt.libvirt.host [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.756 227317 DEBUG nova.virt.libvirt.host [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.756 227317 DEBUG nova.virt.libvirt.host [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.758 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.758 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.759 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.759 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.759 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.759 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.760 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.760 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.760 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.760 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.760 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.761 227317 DEBUG nova.virt.hardware [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:10:06 np0005596062 nova_compute[227313]: 2026-01-26 18:10:06.766 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:10:07 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3658519192' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:10:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:07.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:07.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:07 np0005596062 podman[234333]: 2026-01-26 18:10:07.888658574 +0000 UTC m=+0.099698467 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.017 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.251s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.041 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.045 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:10:08 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1613520140' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.556 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.559 227317 DEBUG nova.virt.libvirt.vif [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-667418438',display_name='tempest-ImagesOneServerTestJSON-server-667418438',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-667418438',id=9,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d9e93b058648dd9344a83f8a43b553',ramdisk_id='',reservation_id='r-ilez556x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-410481541',owner_user_name='tempest-ImagesOneServerTestJSON-410481541-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:10:02Z,user_data=None,user_id='6edbbcec1ba44e8e815998a86fd7dcbb',uuid=e3ae83e1-a1df-447d-aeb0-61a2999954d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.559 227317 DEBUG nova.network.os_vif_util [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Converting VIF {"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.560 227317 DEBUG nova.network.os_vif_util [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.561 227317 DEBUG nova.objects.instance [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lazy-loading 'pci_devices' on Instance uuid e3ae83e1-a1df-447d-aeb0-61a2999954d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.581 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <uuid>e3ae83e1-a1df-447d-aeb0-61a2999954d3</uuid>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <name>instance-00000009</name>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:name>tempest-ImagesOneServerTestJSON-server-667418438</nova:name>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:10:06</nova:creationTime>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:user uuid="6edbbcec1ba44e8e815998a86fd7dcbb">tempest-ImagesOneServerTestJSON-410481541-project-member</nova:user>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:project uuid="a5d9e93b058648dd9344a83f8a43b553">tempest-ImagesOneServerTestJSON-410481541</nova:project>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <nova:port uuid="252e1f93-8a26-43a8-aabc-14548c4d04d5">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <entry name="serial">e3ae83e1-a1df-447d-aeb0-61a2999954d3</entry>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <entry name="uuid">e3ae83e1-a1df-447d-aeb0-61a2999954d3</entry>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk.config">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:a3:4d:08"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <target dev="tap252e1f93-8a"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/console.log" append="off"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:10:08 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:10:08 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:10:08 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:10:08 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.582 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Preparing to wait for external event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.583 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.583 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.583 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.584 227317 DEBUG nova.virt.libvirt.vif [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-667418438',display_name='tempest-ImagesOneServerTestJSON-server-667418438',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-667418438',id=9,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d9e93b058648dd9344a83f8a43b553',ramdisk_id='',reservation_id='r-ilez556x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-410481541',owner_user_name='tempest-ImagesOneServerTestJSON-410481541-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:10:02Z,user_data=None,user_id='6edbbcec1ba44e8e815998a86fd7dcbb',uuid=e3ae83e1-a1df-447d-aeb0-61a2999954d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.585 227317 DEBUG nova.network.os_vif_util [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Converting VIF {"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.585 227317 DEBUG nova.network.os_vif_util [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.586 227317 DEBUG os_vif [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.586 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.587 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.587 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.591 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.591 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap252e1f93-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.592 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap252e1f93-8a, col_values=(('external_ids', {'iface-id': '252e1f93-8a26-43a8-aabc-14548c4d04d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:4d:08', 'vm-uuid': 'e3ae83e1-a1df-447d-aeb0-61a2999954d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.594 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:08 np0005596062 NetworkManager[48993]: <info>  [1769451008.5948] manager: (tap252e1f93-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.596 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.601 227317 INFO os_vif [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a')#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.651 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.652 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.653 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] No VIF found with MAC fa:16:3e:a3:4d:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.654 227317 INFO nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Using config drive#033[00m
Jan 26 13:10:08 np0005596062 nova_compute[227313]: 2026-01-26 18:10:08.684 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.094 227317 INFO nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Creating config drive at /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/disk.config#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.101 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpknfb09h_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:09.159 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:09.159 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:09.160 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.232 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpknfb09h_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.263 227317 DEBUG nova.storage.rbd_utils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] rbd image e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.268 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/disk.config e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:09.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:09.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.855 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.961 227317 DEBUG nova.network.neutron [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Updated VIF entry in instance network info cache for port 252e1f93-8a26-43a8-aabc-14548c4d04d5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:10:09 np0005596062 nova_compute[227313]: 2026-01-26 18:10:09.962 227317 DEBUG nova.network.neutron [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Updating instance_info_cache with network_info: [{"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.021 227317 DEBUG oslo_concurrency.lockutils [req-1c84e49a-5abd-4c99-ab66-19f741aa7c32 req-ae3910cc-09a2-4721-949c-29ca6970e7dd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-e3ae83e1-a1df-447d-aeb0-61a2999954d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.331 227317 DEBUG oslo_concurrency.processutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/disk.config e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.332 227317 INFO nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Deleting local config drive /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3/disk.config because it was imported into RBD.#033[00m
Jan 26 13:10:10 np0005596062 kernel: tap252e1f93-8a: entered promiscuous mode
Jan 26 13:10:10 np0005596062 NetworkManager[48993]: <info>  [1769451010.3767] manager: (tap252e1f93-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.376 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:10Z|00080|binding|INFO|Claiming lport 252e1f93-8a26-43a8-aabc-14548c4d04d5 for this chassis.
Jan 26 13:10:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:10Z|00081|binding|INFO|252e1f93-8a26-43a8-aabc-14548c4d04d5: Claiming fa:16:3e:a3:4d:08 10.100.0.11
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.382 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.384 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.395 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:4d:08 10.100.0.11'], port_security=['fa:16:3e:a3:4d:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e3ae83e1-a1df-447d-aeb0-61a2999954d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d9e93b058648dd9344a83f8a43b553', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9defcda-e93a-4e7e-be77-c7d66d072ae7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5816f6f1-5324-4cc2-baa5-3625180de3c2, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=252e1f93-8a26-43a8-aabc-14548c4d04d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.397 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 252e1f93-8a26-43a8-aabc-14548c4d04d5 in datapath 708cab4e-4abe-44cd-827b-483dc7e11b5a bound to our chassis#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.398 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 708cab4e-4abe-44cd-827b-483dc7e11b5a#033[00m
Jan 26 13:10:10 np0005596062 systemd-udevd[234524]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:10:10 np0005596062 systemd-machined[195380]: New machine qemu-6-instance-00000009.
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.410 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e32cb169-b355-4e79-bf64-908eeb0ccc9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.411 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap708cab4e-41 in ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.412 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap708cab4e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.413 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[90755a22-2405-4487-a1cf-9f06a1e8ef08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.413 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[298a2b94-9e72-4675-b153-3d06ee0ce3e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 NetworkManager[48993]: <info>  [1769451010.4209] device (tap252e1f93-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:10:10 np0005596062 NetworkManager[48993]: <info>  [1769451010.4222] device (tap252e1f93-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.425 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[e463098d-455b-4772-bca9-2c98a38daf21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 systemd[1]: Started Virtual Machine qemu-6-instance-00000009.
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.448 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:10Z|00082|binding|INFO|Setting lport 252e1f93-8a26-43a8-aabc-14548c4d04d5 ovn-installed in OVS
Jan 26 13:10:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:10Z|00083|binding|INFO|Setting lport 252e1f93-8a26-43a8-aabc-14548c4d04d5 up in Southbound
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.452 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[33da23b7-6e3c-49ba-948d-78e41f525159]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.454 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.476 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f995d727-661a-4157-ad34-7b0c6159b584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.482 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[31ff99f4-b459-4516-b72f-440286928b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 systemd-udevd[234529]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:10:10 np0005596062 NetworkManager[48993]: <info>  [1769451010.4833] manager: (tap708cab4e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.508 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[22a38e3b-d800-463a-8325-4a8261969f23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.511 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[cc5e6002-31b2-4f41-b208-05cf7b9d1fcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 NetworkManager[48993]: <info>  [1769451010.5295] device (tap708cab4e-40): carrier: link connected
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.534 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[b39c3fc0-4000-4bc6-95bc-d6ba031271f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.550 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[44e7394c-9276-44b1-9ec5-96df2e4f53e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap708cab4e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:4d:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477058, 'reachable_time': 42723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234559, 'error': None, 'target': 'ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.563 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6ac37b-2383-42ea-b59b-a00fb437d3ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:4deb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477058, 'tstamp': 477058}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234560, 'error': None, 'target': 'ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.578 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9983d5-6ef9-42d8-9014-dd87180db4b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap708cab4e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:4d:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477058, 'reachable_time': 42723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234561, 'error': None, 'target': 'ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.604 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[66be007d-5142-4f0a-86f3-b49ec8fc8d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.664 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[25070772-cc57-4fdb-b871-34b49289fba3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.666 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap708cab4e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.666 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.666 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap708cab4e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:10 np0005596062 kernel: tap708cab4e-40: entered promiscuous mode
Jan 26 13:10:10 np0005596062 NetworkManager[48993]: <info>  [1769451010.7311] manager: (tap708cab4e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.732 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap708cab4e-40, col_values=(('external_ids', {'iface-id': '6cfbb509-589e-4555-aa4f-cf9fa6666285'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.729 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.733 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:10Z|00084|binding|INFO|Releasing lport 6cfbb509-589e-4555-aa4f-cf9fa6666285 from this chassis (sb_readonly=0)
Jan 26 13:10:10 np0005596062 nova_compute[227313]: 2026-01-26 18:10:10.751 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.752 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/708cab4e-4abe-44cd-827b-483dc7e11b5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/708cab4e-4abe-44cd-827b-483dc7e11b5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.753 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0d80c74d-32d7-4779-9366-4602e8f54e38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.753 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-708cab4e-4abe-44cd-827b-483dc7e11b5a
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/708cab4e-4abe-44cd-827b-483dc7e11b5a.pid.haproxy
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 708cab4e-4abe-44cd-827b-483dc7e11b5a
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 13:10:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:10.754 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'env', 'PROCESS_TAG=haproxy-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/708cab4e-4abe-44cd-827b-483dc7e11b5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 13:10:11 np0005596062 podman[234629]: 2026-01-26 18:10:11.116064922 +0000 UTC m=+0.024096273 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.301 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451011.3010483, e3ae83e1-a1df-447d-aeb0-61a2999954d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.302 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] VM Started (Lifecycle Event)
Jan 26 13:10:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:11.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:11.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.381 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:11 np0005596062 podman[234629]: 2026-01-26 18:10:11.386368984 +0000 UTC m=+0.294400325 container create 5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.388 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451011.3013065, e3ae83e1-a1df-447d-aeb0-61a2999954d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.388 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] VM Paused (Lifecycle Event)
Jan 26 13:10:11 np0005596062 systemd[1]: Started libpod-conmon-5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f.scope.
Jan 26 13:10:11 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:10:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72726cc5c1707038c21cf96d40eda200d5c5bec3534071de02fa36e55a0cece1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:10:11 np0005596062 podman[234629]: 2026-01-26 18:10:11.490493539 +0000 UTC m=+0.398524920 container init 5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 13:10:11 np0005596062 podman[234629]: 2026-01-26 18:10:11.496030106 +0000 UTC m=+0.404061447 container start 5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 13:10:11 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [NOTICE]   (234654) : New worker (234656) forked
Jan 26 13:10:11 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [NOTICE]   (234654) : Loading success.
Jan 26 13:10:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.644 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.649 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 13:10:11 np0005596062 nova_compute[227313]: 2026-01-26 18:10:11.675 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.668 227317 DEBUG nova.compute.manager [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.669 227317 DEBUG oslo_concurrency.lockutils [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.670 227317 DEBUG oslo_concurrency.lockutils [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.670 227317 DEBUG oslo_concurrency.lockutils [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.671 227317 DEBUG nova.compute.manager [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Processing event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.671 227317 DEBUG nova.compute.manager [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.672 227317 DEBUG oslo_concurrency.lockutils [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.672 227317 DEBUG oslo_concurrency.lockutils [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.673 227317 DEBUG oslo_concurrency.lockutils [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.674 227317 DEBUG nova.compute.manager [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] No waiting events found dispatching network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.674 227317 WARNING nova.compute.manager [req-937dee54-3479-4f91-bcdd-04ee5e2d4201 req-2d737173-ced1-4ae9-bc6e-8d64ef506d23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received unexpected event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 for instance with vm_state building and task_state spawning.
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.675 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.680 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451012.6798034, e3ae83e1-a1df-447d-aeb0-61a2999954d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.681 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] VM Resumed (Lifecycle Event)
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.684 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.688 227317 INFO nova.virt.libvirt.driver [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Instance spawned successfully.
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.689 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.749 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.756 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.760 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.760 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.761 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.762 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.762 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.763 227317 DEBUG nova.virt.libvirt.driver [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.803 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.936 227317 INFO nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Took 10.74 seconds to spawn the instance on the hypervisor.
Jan 26 13:10:12 np0005596062 nova_compute[227313]: 2026-01-26 18:10:12.936 227317 DEBUG nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:13 np0005596062 nova_compute[227313]: 2026-01-26 18:10:13.009 227317 INFO nova.compute.manager [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Took 12.14 seconds to build instance.
Jan 26 13:10:13 np0005596062 nova_compute[227313]: 2026-01-26 18:10:13.075 227317 DEBUG oslo_concurrency.lockutils [None req-3f82c17d-50e9-4ad1-b740-72bc59d61244 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:13.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:13.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:13 np0005596062 nova_compute[227313]: 2026-01-26 18:10:13.595 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:14 np0005596062 nova_compute[227313]: 2026-01-26 18:10:14.857 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:15.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:15.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:17.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:17.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:17 np0005596062 nova_compute[227313]: 2026-01-26 18:10:17.535 227317 DEBUG nova.compute.manager [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:17 np0005596062 nova_compute[227313]: 2026-01-26 18:10:17.585 227317 INFO nova.compute.manager [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] instance snapshotting
Jan 26 13:10:17 np0005596062 nova_compute[227313]: 2026-01-26 18:10:17.768 227317 INFO nova.virt.libvirt.driver [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Beginning live snapshot process
Jan 26 13:10:18 np0005596062 nova_compute[227313]: 2026-01-26 18:10:18.075 227317 DEBUG nova.virt.libvirt.imagebackend [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] No parent info for 57de5960-c1c5-4cfa-af34-8f58cf25f585; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Jan 26 13:10:18 np0005596062 nova_compute[227313]: 2026-01-26 18:10:18.453 227317 DEBUG nova.storage.rbd_utils [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] creating snapshot(d11fc5931f584799820a9dfb325cdfe0) on rbd image(e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 26 13:10:18 np0005596062 nova_compute[227313]: 2026-01-26 18:10:18.597 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:10:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:10:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:10:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:10:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:10:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 26 13:10:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:19.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:19.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:19 np0005596062 nova_compute[227313]: 2026-01-26 18:10:19.442 227317 DEBUG nova.storage.rbd_utils [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] cloning vms/e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk@d11fc5931f584799820a9dfb325cdfe0 to images/1d595f5f-1dd5-4fa4-b49e-c4fcc5623e63 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 26 13:10:19 np0005596062 nova_compute[227313]: 2026-01-26 18:10:19.768 227317 DEBUG nova.storage.rbd_utils [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] flattening images/1d595f5f-1dd5-4fa4-b49e-c4fcc5623e63 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 26 13:10:19 np0005596062 nova_compute[227313]: 2026-01-26 18:10:19.859 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:20 np0005596062 nova_compute[227313]: 2026-01-26 18:10:20.675 227317 DEBUG nova.storage.rbd_utils [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] removing snapshot(d11fc5931f584799820a9dfb325cdfe0) on rbd image(e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 26 13:10:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:21.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:10:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:21.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:10:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 26 13:10:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:21 np0005596062 nova_compute[227313]: 2026-01-26 18:10:21.718 227317 DEBUG nova.storage.rbd_utils [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] creating snapshot(snap) on rbd image(1d595f5f-1dd5-4fa4-b49e-c4fcc5623e63) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:10:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 26 13:10:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:23.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:23.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:23 np0005596062 nova_compute[227313]: 2026-01-26 18:10:23.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:24 np0005596062 nova_compute[227313]: 2026-01-26 18:10:24.861 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:24 np0005596062 nova_compute[227313]: 2026-01-26 18:10:24.892 227317 INFO nova.virt.libvirt.driver [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Snapshot image upload complete#033[00m
Jan 26 13:10:24 np0005596062 nova_compute[227313]: 2026-01-26 18:10:24.893 227317 INFO nova.compute.manager [None req-5bcd03f2-4d5b-4cc4-a69f-d116ceb2d762 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Took 7.31 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 26 13:10:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:25.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:25.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:26 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:26Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:4d:08 10.100.0.11
Jan 26 13:10:26 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:26Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:4d:08 10.100.0.11
Jan 26 13:10:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:10:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:27.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:10:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:27.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 26 13:10:28 np0005596062 nova_compute[227313]: 2026-01-26 18:10:28.641 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:29 np0005596062 nova_compute[227313]: 2026-01-26 18:10:29.150 227317 DEBUG nova.compute.manager [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:10:29 np0005596062 nova_compute[227313]: 2026-01-26 18:10:29.202 227317 INFO nova.compute.manager [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] instance snapshotting#033[00m
Jan 26 13:10:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:29.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:29.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 26 13:10:29 np0005596062 nova_compute[227313]: 2026-01-26 18:10:29.424 227317 INFO nova.virt.libvirt.driver [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Beginning live snapshot process#033[00m
Jan 26 13:10:29 np0005596062 nova_compute[227313]: 2026-01-26 18:10:29.584 227317 DEBUG nova.virt.libvirt.imagebackend [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] No parent info for 57de5960-c1c5-4cfa-af34-8f58cf25f585; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 26 13:10:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:10:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:10:29 np0005596062 nova_compute[227313]: 2026-01-26 18:10:29.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:29 np0005596062 nova_compute[227313]: 2026-01-26 18:10:29.892 227317 DEBUG nova.storage.rbd_utils [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] creating snapshot(a1389aecbdb6406992f216897f3adece) on rbd image(e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:10:30 np0005596062 podman[235099]: 2026-01-26 18:10:30.90206318 +0000 UTC m=+0.100893259 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 13:10:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 26 13:10:31 np0005596062 nova_compute[227313]: 2026-01-26 18:10:31.197 227317 DEBUG nova.storage.rbd_utils [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] cloning vms/e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk@a1389aecbdb6406992f216897f3adece to images/e8c9ffb4-4bc9-47a2-af24-9fbb9b932ec2 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 26 13:10:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:10:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:31.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:10:31 np0005596062 nova_compute[227313]: 2026-01-26 18:10:31.353 227317 DEBUG nova.storage.rbd_utils [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] flattening images/e8c9ffb4-4bc9-47a2-af24-9fbb9b932ec2 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 26 13:10:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:31.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:33.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:33.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:33 np0005596062 nova_compute[227313]: 2026-01-26 18:10:33.697 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:33 np0005596062 nova_compute[227313]: 2026-01-26 18:10:33.732 227317 DEBUG nova.storage.rbd_utils [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] removing snapshot(a1389aecbdb6406992f216897f3adece) on rbd image(e3ae83e1-a1df-447d-aeb0-61a2999954d3_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 26 13:10:34 np0005596062 nova_compute[227313]: 2026-01-26 18:10:34.866 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 26 13:10:34 np0005596062 nova_compute[227313]: 2026-01-26 18:10:34.962 227317 DEBUG nova.storage.rbd_utils [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] creating snapshot(snap) on rbd image(e8c9ffb4-4bc9-47a2-af24-9fbb9b932ec2) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:10:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:35.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:35.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 26 13:10:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:37.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:38 np0005596062 nova_compute[227313]: 2026-01-26 18:10:38.101 227317 INFO nova.virt.libvirt.driver [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Snapshot image upload complete#033[00m
Jan 26 13:10:38 np0005596062 nova_compute[227313]: 2026-01-26 18:10:38.102 227317 INFO nova.compute.manager [None req-7e721653-0fd0-4569-9fb7-ed9fb728fb8f 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Took 8.90 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 26 13:10:38 np0005596062 nova_compute[227313]: 2026-01-26 18:10:38.755 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:38 np0005596062 podman[235212]: 2026-01-26 18:10:38.923839686 +0000 UTC m=+0.121426957 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 13:10:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:10:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:39.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:10:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:39.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:39 np0005596062 nova_compute[227313]: 2026-01-26 18:10:39.914 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 26 13:10:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.004000106s ======
Jan 26 13:10:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000106s
Jan 26 13:10:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:41.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 26 13:10:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:43.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:43.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.632 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.632 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.633 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.633 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.634 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.636 227317 INFO nova.compute.manager [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Terminating instance#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.637 227317 DEBUG nova.compute.manager [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:10:43 np0005596062 kernel: tap252e1f93-8a (unregistering): left promiscuous mode
Jan 26 13:10:43 np0005596062 NetworkManager[48993]: <info>  [1769451043.7294] device (tap252e1f93-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:10:43 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:43Z|00085|binding|INFO|Releasing lport 252e1f93-8a26-43a8-aabc-14548c4d04d5 from this chassis (sb_readonly=0)
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.745 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:43 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:43Z|00086|binding|INFO|Setting lport 252e1f93-8a26-43a8-aabc-14548c4d04d5 down in Southbound
Jan 26 13:10:43 np0005596062 ovn_controller[133984]: 2026-01-26T18:10:43Z|00087|binding|INFO|Removing iface tap252e1f93-8a ovn-installed in OVS
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.749 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.757 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:43.755 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:4d:08 10.100.0.11'], port_security=['fa:16:3e:a3:4d:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e3ae83e1-a1df-447d-aeb0-61a2999954d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d9e93b058648dd9344a83f8a43b553', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9defcda-e93a-4e7e-be77-c7d66d072ae7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5816f6f1-5324-4cc2-baa5-3625180de3c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=252e1f93-8a26-43a8-aabc-14548c4d04d5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:10:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:43.757 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 252e1f93-8a26-43a8-aabc-14548c4d04d5 in datapath 708cab4e-4abe-44cd-827b-483dc7e11b5a unbound from our chassis#033[00m
Jan 26 13:10:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:43.758 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 708cab4e-4abe-44cd-827b-483dc7e11b5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:10:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:43.760 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[28c38954-eab9-477c-af9e-96e045829a25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:43.760 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a namespace which is not needed anymore#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.767 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:43 np0005596062 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 26 13:10:43 np0005596062 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Consumed 16.033s CPU time.
Jan 26 13:10:43 np0005596062 systemd-machined[195380]: Machine qemu-6-instance-00000009 terminated.
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.880 227317 INFO nova.virt.libvirt.driver [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Instance destroyed successfully.#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.881 227317 DEBUG nova.objects.instance [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lazy-loading 'resources' on Instance uuid e3ae83e1-a1df-447d-aeb0-61a2999954d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.898 227317 DEBUG nova.virt.libvirt.vif [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-667418438',display_name='tempest-ImagesOneServerTestJSON-server-667418438',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-667418438',id=9,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:10:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5d9e93b058648dd9344a83f8a43b553',ramdisk_id='',reservation_id='r-ilez556x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-410481541',owner_user_name='tempest-ImagesOneServerTestJSON-410481541-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:10:38Z,user_data=None,user_id='6edbbcec1ba44e8e815998a86fd7dcbb',uuid=e3ae83e1-a1df-447d-aeb0-61a2999954d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.898 227317 DEBUG nova.network.os_vif_util [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Converting VIF {"id": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "address": "fa:16:3e:a3:4d:08", "network": {"id": "708cab4e-4abe-44cd-827b-483dc7e11b5a", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1555594411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a5d9e93b058648dd9344a83f8a43b553", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap252e1f93-8a", "ovs_interfaceid": "252e1f93-8a26-43a8-aabc-14548c4d04d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.902 227317 DEBUG nova.network.os_vif_util [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.903 227317 DEBUG os_vif [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.905 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.905 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap252e1f93-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.908 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.911 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:10:43 np0005596062 nova_compute[227313]: 2026-01-26 18:10:43.914 227317 INFO os_vif [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:4d:08,bridge_name='br-int',has_traffic_filtering=True,id=252e1f93-8a26-43a8-aabc-14548c4d04d5,network=Network(708cab4e-4abe-44cd-827b-483dc7e11b5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap252e1f93-8a')#033[00m
Jan 26 13:10:43 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [NOTICE]   (234654) : haproxy version is 2.8.14-c23fe91
Jan 26 13:10:43 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [NOTICE]   (234654) : path to executable is /usr/sbin/haproxy
Jan 26 13:10:43 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [WARNING]  (234654) : Exiting Master process...
Jan 26 13:10:43 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [ALERT]    (234654) : Current worker (234656) exited with code 143 (Terminated)
Jan 26 13:10:43 np0005596062 neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a[234650]: [WARNING]  (234654) : All workers exited. Exiting... (0)
Jan 26 13:10:43 np0005596062 systemd[1]: libpod-5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f.scope: Deactivated successfully.
Jan 26 13:10:43 np0005596062 conmon[234650]: conmon 5c05022387d43e89a7e2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f.scope/container/memory.events
Jan 26 13:10:43 np0005596062 podman[235276]: 2026-01-26 18:10:43.974026892 +0000 UTC m=+0.059020334 container died 5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:10:44 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f-userdata-shm.mount: Deactivated successfully.
Jan 26 13:10:44 np0005596062 systemd[1]: var-lib-containers-storage-overlay-72726cc5c1707038c21cf96d40eda200d5c5bec3534071de02fa36e55a0cece1-merged.mount: Deactivated successfully.
Jan 26 13:10:44 np0005596062 podman[235276]: 2026-01-26 18:10:44.022609956 +0000 UTC m=+0.107603328 container cleanup 5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 13:10:44 np0005596062 systemd[1]: libpod-conmon-5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f.scope: Deactivated successfully.
Jan 26 13:10:44 np0005596062 podman[235322]: 2026-01-26 18:10:44.097939993 +0000 UTC m=+0.051398710 container remove 5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.110 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0a88d852-a6ca-41bd-9542-cf31915f8bfa]: (4, ('Mon Jan 26 06:10:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a (5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f)\n5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f\nMon Jan 26 06:10:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a (5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f)\n5c05022387d43e89a7e2d942a296596efd43a306b2f7a0f9aed4ddd5edb75a4f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.113 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[022bdc5f-7921-4cde-9dad-60622fe802c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.114 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap708cab4e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:10:44 np0005596062 kernel: tap708cab4e-40: left promiscuous mode
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.117 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.130 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.135 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fbec7831-2e4a-4cd7-8c0c-f09b877c8391]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.154 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[85dfe886-8b88-4733-b5ff-380988a36485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.155 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f538f172-4ad1-4694-8f73-a844aa4c5d2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.176 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a1bb9590-8316-4454-8503-830a63a5c385]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477052, 'reachable_time': 27730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235338, 'error': None, 'target': 'ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 systemd[1]: run-netns-ovnmeta\x2d708cab4e\x2d4abe\x2d44cd\x2d827b\x2d483dc7e11b5a.mount: Deactivated successfully.
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.182 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-708cab4e-4abe-44cd-827b-483dc7e11b5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:10:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:44.182 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[6209586a-8c39-47a9-abb7-5ca6f5b7ffe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.221 227317 DEBUG nova.compute.manager [req-2981320c-a279-4d1a-bacd-bdf4de5454a9 req-e836d184-2803-4da9-9696-e9b232ec004f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-vif-unplugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.221 227317 DEBUG oslo_concurrency.lockutils [req-2981320c-a279-4d1a-bacd-bdf4de5454a9 req-e836d184-2803-4da9-9696-e9b232ec004f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.221 227317 DEBUG oslo_concurrency.lockutils [req-2981320c-a279-4d1a-bacd-bdf4de5454a9 req-e836d184-2803-4da9-9696-e9b232ec004f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.222 227317 DEBUG oslo_concurrency.lockutils [req-2981320c-a279-4d1a-bacd-bdf4de5454a9 req-e836d184-2803-4da9-9696-e9b232ec004f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.222 227317 DEBUG nova.compute.manager [req-2981320c-a279-4d1a-bacd-bdf4de5454a9 req-e836d184-2803-4da9-9696-e9b232ec004f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] No waiting events found dispatching network-vif-unplugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.222 227317 DEBUG nova.compute.manager [req-2981320c-a279-4d1a-bacd-bdf4de5454a9 req-e836d184-2803-4da9-9696-e9b232ec004f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-vif-unplugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:10:44 np0005596062 nova_compute[227313]: 2026-01-26 18:10:44.916 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:45.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:45.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:46 np0005596062 nova_compute[227313]: 2026-01-26 18:10:46.328 227317 DEBUG nova.compute.manager [req-ee6981c7-a5ee-4c90-856e-6defc8a23a9e req-3bd50258-99db-498f-ba42-547e0ee41473 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:10:46 np0005596062 nova_compute[227313]: 2026-01-26 18:10:46.329 227317 DEBUG oslo_concurrency.lockutils [req-ee6981c7-a5ee-4c90-856e-6defc8a23a9e req-3bd50258-99db-498f-ba42-547e0ee41473 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:46 np0005596062 nova_compute[227313]: 2026-01-26 18:10:46.329 227317 DEBUG oslo_concurrency.lockutils [req-ee6981c7-a5ee-4c90-856e-6defc8a23a9e req-3bd50258-99db-498f-ba42-547e0ee41473 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:46 np0005596062 nova_compute[227313]: 2026-01-26 18:10:46.329 227317 DEBUG oslo_concurrency.lockutils [req-ee6981c7-a5ee-4c90-856e-6defc8a23a9e req-3bd50258-99db-498f-ba42-547e0ee41473 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:10:46 np0005596062 nova_compute[227313]: 2026-01-26 18:10:46.330 227317 DEBUG nova.compute.manager [req-ee6981c7-a5ee-4c90-856e-6defc8a23a9e req-3bd50258-99db-498f-ba42-547e0ee41473 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] No waiting events found dispatching network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:10:46 np0005596062 nova_compute[227313]: 2026-01-26 18:10:46.330 227317 WARNING nova.compute.manager [req-ee6981c7-a5ee-4c90-856e-6defc8a23a9e req-3bd50258-99db-498f-ba42-547e0ee41473 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received unexpected event network-vif-plugged-252e1f93-8a26-43a8-aabc-14548c4d04d5 for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:10:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:10:46 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2891518576' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:10:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:47.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:47.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:47 np0005596062 nova_compute[227313]: 2026-01-26 18:10:47.770 227317 INFO nova.virt.libvirt.driver [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Deleting instance files /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3_del#033[00m
Jan 26 13:10:47 np0005596062 nova_compute[227313]: 2026-01-26 18:10:47.771 227317 INFO nova.virt.libvirt.driver [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Deletion of /var/lib/nova/instances/e3ae83e1-a1df-447d-aeb0-61a2999954d3_del complete#033[00m
Jan 26 13:10:47 np0005596062 nova_compute[227313]: 2026-01-26 18:10:47.841 227317 INFO nova.compute.manager [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Took 4.20 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:10:47 np0005596062 nova_compute[227313]: 2026-01-26 18:10:47.842 227317 DEBUG oslo.service.loopingcall [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:10:47 np0005596062 nova_compute[227313]: 2026-01-26 18:10:47.842 227317 DEBUG nova.compute.manager [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:10:47 np0005596062 nova_compute[227313]: 2026-01-26 18:10:47.843 227317 DEBUG nova.network.neutron [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.483 227317 DEBUG nova.network.neutron [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.504 227317 INFO nova.compute.manager [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Took 0.66 seconds to deallocate network for instance.#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.557 227317 DEBUG nova.compute.manager [req-2372ac06-6502-4303-b866-b741fbb05cf4 req-12f2006c-d5a0-4356-a250-bab143f5688b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Received event network-vif-deleted-252e1f93-8a26-43a8-aabc-14548c4d04d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.559 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.559 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.614 227317 DEBUG oslo_concurrency.processutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:48 np0005596062 nova_compute[227313]: 2026-01-26 18:10:48.909 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:10:49 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3731370380' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.148 227317 DEBUG oslo_concurrency.processutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.157 227317 DEBUG nova.compute.provider_tree [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.174 227317 DEBUG nova.scheduler.client.report [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.204 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.231 227317 INFO nova.scheduler.client.report [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Deleted allocations for instance e3ae83e1-a1df-447d-aeb0-61a2999954d3
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.306 227317 DEBUG oslo_concurrency.lockutils [None req-1e30fca4-25bb-4d39-85e5-23ddf3f7abb9 6edbbcec1ba44e8e815998a86fd7dcbb a5d9e93b058648dd9344a83f8a43b553 - - default default] Lock "e3ae83e1-a1df-447d-aeb0-61a2999954d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:49.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:49.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:49 np0005596062 nova_compute[227313]: 2026-01-26 18:10:49.918 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 26 13:10:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:51.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:51.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:51 np0005596062 nova_compute[227313]: 2026-01-26 18:10:51.873 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "c42862e3-817c-4dff-a467-eb6a68749618" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:51 np0005596062 nova_compute[227313]: 2026-01-26 18:10:51.874 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:51 np0005596062 nova_compute[227313]: 2026-01-26 18:10:51.894 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.103 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.104 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.113 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.114 227317 INFO nova.compute.claims [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Claim successful on node compute-2.ctlplane.example.com
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.207 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:10:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:10:52 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1240629555' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.663 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.670 227317 DEBUG nova.compute.provider_tree [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.692 227317 DEBUG nova.scheduler.client.report [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.715 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.716 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.771 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.772 227317 DEBUG nova.network.neutron [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.802 227317 INFO nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.822 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.920 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.922 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.923 227317 INFO nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Creating image(s)
Jan 26 13:10:52 np0005596062 nova_compute[227313]: 2026-01-26 18:10:52.968 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.012 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.049 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.054 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.091 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.150 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.150 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.151 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.152 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.184 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.188 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 c42862e3-817c-4dff-a467-eb6a68749618_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.253 227317 DEBUG nova.network.neutron [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.253 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 13:10:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:53.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:53.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:53 np0005596062 nova_compute[227313]: 2026-01-26 18:10:53.914 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.126 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 c42862e3-817c-4dff-a467-eb6a68749618_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.938s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.230 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] resizing rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.297 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.298 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.298 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.396 227317 DEBUG nova.objects.instance [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lazy-loading 'migration_context' on Instance uuid c42862e3-817c-4dff-a467-eb6a68749618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.412 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.412 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Ensure instance console log exists: /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.413 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.413 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.414 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.415 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.419 227317 WARNING nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.424 227317 DEBUG nova.virt.libvirt.host [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.425 227317 DEBUG nova.virt.libvirt.host [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.427 227317 DEBUG nova.virt.libvirt.host [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.428 227317 DEBUG nova.virt.libvirt.host [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.429 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.429 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.430 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.430 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.430 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.430 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.430 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.431 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.431 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.431 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.431 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.432 227317 DEBUG nova.virt.hardware [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.435 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.652 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:54 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:54.653 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:10:54 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:10:54.654 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:10:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:10:54 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3060958046' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.878 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.914 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.919 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:54 np0005596062 nova_compute[227313]: 2026-01-26 18:10:54.946 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:10:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:10:55 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/371365968' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.362 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.363 227317 DEBUG nova.objects.instance [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lazy-loading 'pci_devices' on Instance uuid c42862e3-817c-4dff-a467-eb6a68749618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:10:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:55.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.379 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <uuid>c42862e3-817c-4dff-a467-eb6a68749618</uuid>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <name>instance-0000000a</name>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:name>tempest-MigrationsAdminTest-server-998560965</nova:name>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:10:54</nova:creationTime>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:user uuid="f7f416104f314e4db87eda6b639ad3e0">tempest-MigrationsAdminTest-1151354117-project-member</nova:user>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <nova:project uuid="9b25b015b6314a2baab5bf794d8d5526">tempest-MigrationsAdminTest-1151354117</nova:project>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <nova:ports/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <entry name="serial">c42862e3-817c-4dff-a467-eb6a68749618</entry>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <entry name="uuid">c42862e3-817c-4dff-a467-eb6a68749618</entry>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/c42862e3-817c-4dff-a467-eb6a68749618_disk">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/c42862e3-817c-4dff-a467-eb6a68749618_disk.config">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/console.log" append="off"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:10:55 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:10:55 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:10:55 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:10:55 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:10:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:55.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.607 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.607 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.608 227317 INFO nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Using config drive#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.638 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.882 227317 INFO nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Creating config drive at /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/disk.config#033[00m
Jan 26 13:10:55 np0005596062 nova_compute[227313]: 2026-01-26 18:10:55.886 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2li2ei5q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.014 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2li2ei5q" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.045 227317 DEBUG nova.storage.rbd_utils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rbd image c42862e3-817c-4dff-a467-eb6a68749618_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.052 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/disk.config c42862e3-817c-4dff-a467-eb6a68749618_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.089 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.090 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.091 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.235 227317 DEBUG oslo_concurrency.processutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/disk.config c42862e3-817c-4dff-a467-eb6a68749618_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.236 227317 INFO nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Deleting local config drive /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/disk.config because it was imported into RBD.#033[00m
Jan 26 13:10:56 np0005596062 systemd-machined[195380]: New machine qemu-7-instance-0000000a.
Jan 26 13:10:56 np0005596062 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Jan 26 13:10:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.706 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451056.7063715, c42862e3-817c-4dff-a467-eb6a68749618 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.709 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.716 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.717 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.721 227317 INFO nova.virt.libvirt.driver [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance spawned successfully.#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.722 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.741 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.746 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.746 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.747 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.747 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.747 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.748 227317 DEBUG nova.virt.libvirt.driver [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.752 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.787 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.788 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451056.707583, c42862e3-817c-4dff-a467-eb6a68749618 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.788 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] VM Started (Lifecycle Event)
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.811 227317 INFO nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Took 3.89 seconds to spawn the instance on the hypervisor.
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.811 227317 DEBUG nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.812 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.820 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.856 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.875 227317 INFO nova.compute.manager [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Took 4.79 seconds to build instance.
Jan 26 13:10:56 np0005596062 nova_compute[227313]: 2026-01-26 18:10:56.899 227317 DEBUG oslo_concurrency.lockutils [None req-7f263c54-ea1a-4abe-87d6-ceddccad9d19 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.354 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.354 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.354 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.355 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c42862e3-817c-4dff-a467-eb6a68749618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:10:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 13:10:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:57.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 13:10:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:10:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:57.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.563 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.932 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.960 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.960 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.961 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.991 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.992 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.993 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.993 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 13:10:57 np0005596062 nova_compute[227313]: 2026-01-26 18:10:57.994 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:10:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:10:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3876967363' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:10:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:10:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3876967363' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:10:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:10:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3837631056' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:10:58 np0005596062 nova_compute[227313]: 2026-01-26 18:10:58.498 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:58.964 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:58.966 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451043.8760157, e3ae83e1-a1df-447d-aeb0-61a2999954d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:58.967 227317 INFO nova.compute.manager [-] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] VM Stopped (Lifecycle Event)
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:58.986 227317 DEBUG nova.compute.manager [None req-5767e375-7d9b-4dcb-98a3-0f762d5e1531 - - - - - -] [instance: e3ae83e1-a1df-447d-aeb0-61a2999954d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:10:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:10:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:10:59.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:10:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:10:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:10:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:10:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.550 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.551 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.701 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.702 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4723MB free_disk=20.92178726196289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.703 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.703 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.811 227317 INFO nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating resource usage from migration 6a0521b3-fe4c-4b81-b349-864a0b7618c6
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.869 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Migration 6a0521b3-fe4c-4b81-b349-864a0b7618c6 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.870 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.870 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.916 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:10:59 np0005596062 nova_compute[227313]: 2026-01-26 18:10:59.949 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:11:00 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2126254641' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.422 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.427 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.438 227317 DEBUG oslo_concurrency.lockutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Acquiring lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.438 227317 DEBUG oslo_concurrency.lockutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Acquired lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.438 227317 DEBUG nova.network.neutron [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.455 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.483 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.484 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.781s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.573 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:00 np0005596062 nova_compute[227313]: 2026-01-26 18:11:00.654 227317 DEBUG nova.network.neutron [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:11:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:00.656 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.266 227317 DEBUG nova.network.neutron [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.283 227317 DEBUG oslo_concurrency.lockutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Releasing lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.371 227317 DEBUG nova.virt.libvirt.driver [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.372 227317 DEBUG nova.virt.libvirt.volume.remotefs [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Creating file /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/617521f44f19434c93de04036f6eeb82.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.372 227317 DEBUG oslo_concurrency.processutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/617521f44f19434c93de04036f6eeb82.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:11:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:01.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:01.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.807 227317 DEBUG oslo_concurrency.processutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/617521f44f19434c93de04036f6eeb82.tmp" returned: 1 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.808 227317 DEBUG oslo_concurrency.processutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/617521f44f19434c93de04036f6eeb82.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.808 227317 DEBUG nova.virt.libvirt.volume.remotefs [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Creating directory /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 26 13:11:01 np0005596062 nova_compute[227313]: 2026-01-26 18:11:01.809 227317 DEBUG oslo_concurrency.processutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:11:01 np0005596062 podman[235833]: 2026-01-26 18:11:01.871036393 +0000 UTC m=+0.084836961 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:11:02 np0005596062 nova_compute[227313]: 2026-01-26 18:11:02.026 227317 DEBUG oslo_concurrency.processutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:11:02 np0005596062 nova_compute[227313]: 2026-01-26 18:11:02.029 227317 DEBUG nova.virt.libvirt.driver [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 26 13:11:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:03.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:03.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:03 np0005596062 nova_compute[227313]: 2026-01-26 18:11:03.966 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:04 np0005596062 nova_compute[227313]: 2026-01-26 18:11:04.924 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:05.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:05.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:07.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:07.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:08 np0005596062 nova_compute[227313]: 2026-01-26 18:11:08.969 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:09.160 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:11:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:09.161 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:11:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:09.161 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:11:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:09.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:09.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:09 np0005596062 podman[235906]: 2026-01-26 18:11:09.926078858 +0000 UTC m=+0.130628671 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 26 13:11:09 np0005596062 nova_compute[227313]: 2026-01-26 18:11:09.927 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:11.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:11.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:12 np0005596062 nova_compute[227313]: 2026-01-26 18:11:12.073 227317 DEBUG nova.virt.libvirt.driver [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 26 13:11:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:13.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:13 np0005596062 nova_compute[227313]: 2026-01-26 18:11:13.971 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:14 np0005596062 nova_compute[227313]: 2026-01-26 18:11:14.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:15.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:15.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:17.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:17 np0005596062 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 26 13:11:17 np0005596062 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 13.935s CPU time.
Jan 26 13:11:17 np0005596062 systemd-machined[195380]: Machine qemu-7-instance-0000000a terminated.
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.104 227317 INFO nova.virt.libvirt.driver [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance shutdown successfully after 16 seconds.#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.110 227317 INFO nova.virt.libvirt.driver [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance destroyed successfully.#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.114 227317 DEBUG nova.virt.libvirt.driver [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.114 227317 DEBUG nova.virt.libvirt.driver [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.223 227317 DEBUG oslo_concurrency.lockutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Acquiring lock "c42862e3-817c-4dff-a467-eb6a68749618-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.224 227317 DEBUG oslo_concurrency.lockutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.224 227317 DEBUG oslo_concurrency.lockutils [None req-be032121-6f8b-4752-8449-9b6cc43a057a ff5dc64803234e5a8aeec49d4f2146c5 ad485adf7ccc464bb29aa8589226ae53 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:11:18 np0005596062 nova_compute[227313]: 2026-01-26 18:11:18.975 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:19.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:19.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:19 np0005596062 nova_compute[227313]: 2026-01-26 18:11:19.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 26 13:11:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:21.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:21.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:23.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:23.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:23 np0005596062 nova_compute[227313]: 2026-01-26 18:11:23.977 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.334814) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451084334875, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2353, "num_deletes": 254, "total_data_size": 5618638, "memory_usage": 5698816, "flush_reason": "Manual Compaction"}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451084362130, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3672252, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23360, "largest_seqno": 25708, "table_properties": {"data_size": 3662722, "index_size": 6024, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19888, "raw_average_key_size": 20, "raw_value_size": 3643544, "raw_average_value_size": 3756, "num_data_blocks": 267, "num_entries": 970, "num_filter_entries": 970, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769450884, "oldest_key_time": 1769450884, "file_creation_time": 1769451084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 27364 microseconds, and 14446 cpu microseconds.
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.362179) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3672252 bytes OK
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.362196) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.363466) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.363478) EVENT_LOG_v1 {"time_micros": 1769451084363474, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.363494) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5608285, prev total WAL file size 5608285, number of live WAL files 2.
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.364619) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3586KB)], [48(7758KB)]
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451084364660, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 11616924, "oldest_snapshot_seqno": -1}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 4995 keys, 9565344 bytes, temperature: kUnknown
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451084432985, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 9565344, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9530311, "index_size": 21419, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126332, "raw_average_key_size": 25, "raw_value_size": 9438340, "raw_average_value_size": 1889, "num_data_blocks": 877, "num_entries": 4995, "num_filter_entries": 4995, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.433252) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 9565344 bytes
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.434683) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.7 rd, 139.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 7.6 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(5.8) write-amplify(2.6) OK, records in: 5518, records dropped: 523 output_compression: NoCompression
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.434733) EVENT_LOG_v1 {"time_micros": 1769451084434720, "job": 28, "event": "compaction_finished", "compaction_time_micros": 68438, "compaction_time_cpu_micros": 22103, "output_level": 6, "num_output_files": 1, "total_output_size": 9565344, "num_input_records": 5518, "num_output_records": 4995, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451084435814, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451084437527, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.364558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.437646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.437652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.437654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.437656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:11:24 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:11:24.437658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:11:24 np0005596062 nova_compute[227313]: 2026-01-26 18:11:24.933 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:25.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:25.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:27.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:27.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:28 np0005596062 nova_compute[227313]: 2026-01-26 18:11:28.980 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:29.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:29.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:29 np0005596062 nova_compute[227313]: 2026-01-26 18:11:29.967 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:30 np0005596062 podman[236168]: 2026-01-26 18:11:30.037883454 +0000 UTC m=+0.061917591 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 13:11:30 np0005596062 podman[236168]: 2026-01-26 18:11:30.142251215 +0000 UTC m=+0.166285312 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 13:11:30 np0005596062 podman[236323]: 2026-01-26 18:11:30.852637254 +0000 UTC m=+0.045932195 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:11:30 np0005596062 podman[236323]: 2026-01-26 18:11:30.861914561 +0000 UTC m=+0.055209482 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:11:31 np0005596062 podman[236389]: 2026-01-26 18:11:31.11106479 +0000 UTC m=+0.075177214 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, name=keepalived, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, release=1793, version=2.2.4, architecture=x86_64)
Jan 26 13:11:31 np0005596062 podman[236389]: 2026-01-26 18:11:31.123167053 +0000 UTC m=+0.087279467 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., name=keepalived, vcs-type=git, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, com.redhat.component=keepalived-container, version=2.2.4)
Jan 26 13:11:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:31.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:31 np0005596062 nova_compute[227313]: 2026-01-26 18:11:31.521 227317 INFO nova.compute.manager [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Swapping old allocation on dict_keys(['65600a65-69bc-488c-8c8c-71cbf43e523a']) held by migration 6a0521b3-fe4c-4b81-b349-864a0b7618c6 for instance#033[00m
Jan 26 13:11:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:31.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:31 np0005596062 nova_compute[227313]: 2026-01-26 18:11:31.553 227317 DEBUG nova.scheduler.client.report [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Overwriting current allocation {'allocations': {'5b280bb3-8c5a-4e32-a75a-9ddf74821dbd': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 21}}, 'project_id': '9b25b015b6314a2baab5bf794d8d5526', 'user_id': 'f7f416104f314e4db87eda6b639ad3e0', 'consumer_generation': 1} on consumer c42862e3-817c-4dff-a467-eb6a68749618 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 26 13:11:31 np0005596062 nova_compute[227313]: 2026-01-26 18:11:31.786 227317 DEBUG oslo_concurrency.lockutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:11:31 np0005596062 nova_compute[227313]: 2026-01-26 18:11:31.787 227317 DEBUG oslo_concurrency.lockutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquired lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:11:31 np0005596062 nova_compute[227313]: 2026-01-26 18:11:31.787 227317 DEBUG nova.network.neutron [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:11:31 np0005596062 nova_compute[227313]: 2026-01-26 18:11:31.965 227317 DEBUG nova.network.neutron [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:11:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:32 np0005596062 ovn_controller[133984]: 2026-01-26T18:11:32Z|00088|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 13:11:32 np0005596062 nova_compute[227313]: 2026-01-26 18:11:32.255 227317 DEBUG nova.network.neutron [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:11:32 np0005596062 nova_compute[227313]: 2026-01-26 18:11:32.283 227317 DEBUG oslo_concurrency.lockutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Releasing lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:11:32 np0005596062 nova_compute[227313]: 2026-01-26 18:11:32.284 227317 DEBUG nova.virt.libvirt.driver [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 26 13:11:32 np0005596062 nova_compute[227313]: 2026-01-26 18:11:32.860 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451077.7726848, c42862e3-817c-4dff-a467-eb6a68749618 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:11:32 np0005596062 nova_compute[227313]: 2026-01-26 18:11:32.861 227317 INFO nova.compute.manager [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:11:32 np0005596062 podman[236430]: 2026-01-26 18:11:32.873748458 +0000 UTC m=+0.074693371 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 13:11:33 np0005596062 nova_compute[227313]: 2026-01-26 18:11:33.040 227317 DEBUG nova.storage.rbd_utils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] rolling back rbd image(c42862e3-817c-4dff-a467-eb6a68749618_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 26 13:11:33 np0005596062 nova_compute[227313]: 2026-01-26 18:11:33.054 227317 DEBUG nova.compute.manager [None req-b667aed8-b4f9-4005-a2c1-e0762940bd31 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:11:33 np0005596062 nova_compute[227313]: 2026-01-26 18:11:33.058 227317 DEBUG nova.compute.manager [None req-b667aed8-b4f9-4005-a2c1-e0762940bd31 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:11:33 np0005596062 nova_compute[227313]: 2026-01-26 18:11:33.083 227317 INFO nova.compute.manager [None req-b667aed8-b4f9-4005-a2c1-e0762940bd31 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 26 13:11:33 np0005596062 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 13:11:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:33.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:33.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:33 np0005596062 nova_compute[227313]: 2026-01-26 18:11:33.982 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:34 np0005596062 nova_compute[227313]: 2026-01-26 18:11:34.977 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:34 np0005596062 nova_compute[227313]: 2026-01-26 18:11:34.978 227317 DEBUG nova.storage.rbd_utils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] removing snapshot(nova-resize) on rbd image(c42862e3-817c-4dff-a467-eb6a68749618_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 26 13:11:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:35.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:35.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:37.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 26 13:11:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:37.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:11:38 np0005596062 nova_compute[227313]: 2026-01-26 18:11:38.985 227317 DEBUG nova.virt.libvirt.driver [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:11:38 np0005596062 nova_compute[227313]: 2026-01-26 18:11:38.986 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:38 np0005596062 nova_compute[227313]: 2026-01-26 18:11:38.990 227317 WARNING nova.virt.libvirt.driver [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:11:38 np0005596062 nova_compute[227313]: 2026-01-26 18:11:38.996 227317 DEBUG nova.virt.libvirt.host [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:11:38 np0005596062 nova_compute[227313]: 2026-01-26 18:11:38.997 227317 DEBUG nova.virt.libvirt.host [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.000 227317 DEBUG nova.virt.libvirt.host [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.000 227317 DEBUG nova.virt.libvirt.host [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.001 227317 DEBUG nova.virt.libvirt.driver [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.001 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.002 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.002 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.002 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.002 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.002 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.002 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.003 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.003 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.003 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.003 227317 DEBUG nova.virt.hardware [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.003 227317 DEBUG nova.objects.instance [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c42862e3-817c-4dff-a467-eb6a68749618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.218 227317 DEBUG oslo_concurrency.processutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.325 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:39.324 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:11:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:39.326 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:11:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:39.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:39.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:11:39 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1877526519' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:11:39 np0005596062 nova_compute[227313]: 2026-01-26 18:11:39.972 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:11:40 np0005596062 nova_compute[227313]: 2026-01-26 18:11:40.188 227317 DEBUG oslo_concurrency.processutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.970s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:11:40 np0005596062 nova_compute[227313]: 2026-01-26 18:11:40.235 227317 DEBUG oslo_concurrency.processutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:11:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:11:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1222380491' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:11:40 np0005596062 nova_compute[227313]: 2026-01-26 18:11:40.695 227317 DEBUG oslo_concurrency.processutils [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:11:40 np0005596062 nova_compute[227313]: 2026-01-26 18:11:40.702 227317 DEBUG nova.virt.libvirt.driver [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <uuid>c42862e3-817c-4dff-a467-eb6a68749618</uuid>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <name>instance-0000000a</name>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:name>tempest-MigrationsAdminTest-server-998560965</nova:name>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:11:38</nova:creationTime>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:user uuid="f7f416104f314e4db87eda6b639ad3e0">tempest-MigrationsAdminTest-1151354117-project-member</nova:user>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <nova:project uuid="9b25b015b6314a2baab5bf794d8d5526">tempest-MigrationsAdminTest-1151354117</nova:project>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <nova:ports/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <entry name="serial">c42862e3-817c-4dff-a467-eb6a68749618</entry>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <entry name="uuid">c42862e3-817c-4dff-a467-eb6a68749618</entry>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:11:40 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/c42862e3-817c-4dff-a467-eb6a68749618_disk">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/c42862e3-817c-4dff-a467-eb6a68749618_disk.config">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618/console.log" append="off"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <input type="keyboard" bus="usb"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:11:40 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:11:40 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:11:40 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:11:40 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:11:40 np0005596062 virtqemud[226715]: End of file while reading data: Input/output error
Jan 26 13:11:40 np0005596062 virtqemud[226715]: End of file while reading data: Input/output error
Jan 26 13:11:40 np0005596062 systemd-machined[195380]: New machine qemu-8-instance-0000000a.
Jan 26 13:11:40 np0005596062 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Jan 26 13:11:40 np0005596062 podman[236814]: 2026-01-26 18:11:40.840610012 +0000 UTC m=+0.081585725 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:11:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:41.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.468 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451101.467219, c42862e3-817c-4dff-a467-eb6a68749618 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.468 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.472 227317 DEBUG nova.compute.manager [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.476 227317 INFO nova.virt.libvirt.driver [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance running successfully.#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.477 227317 DEBUG nova.virt.libvirt.driver [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.505 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.515 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.547 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.548 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451101.4675922, c42862e3-817c-4dff-a467-eb6a68749618 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:11:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.548 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] VM Started (Lifecycle Event)#033[00m
Jan 26 13:11:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.004000106s ======
Jan 26 13:11:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:41.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000106s
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.635 227317 INFO nova.compute.manager [None req-d1b766d6-715f-428f-9522-2a7bc492fb31 f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating instance to original state: 'active'#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.676 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.679 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:11:41 np0005596062 nova_compute[227313]: 2026-01-26 18:11:41.715 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 26 13:11:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:11:42.328 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:11:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:43.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:43.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:11:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5008 writes, 25K keys, 5008 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 5008 writes, 5008 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1554 writes, 7405 keys, 1554 commit groups, 1.0 writes per commit group, ingest: 15.97 MB, 0.03 MB/s#012Interval WAL: 1554 writes, 1554 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     65.1      0.50              0.11        14    0.036       0      0       0.0       0.0#012  L6      1/0    9.12 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.3     71.1     58.5      1.86              0.36        13    0.143     62K   6923       0.0       0.0#012 Sum      1/0    9.12 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.3     56.0     59.9      2.35              0.47        27    0.087     62K   6923       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   5.1    107.5    110.4      0.47              0.16        10    0.047     26K   2556       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     71.1     58.5      1.86              0.36        13    0.143     62K   6923       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     65.3      0.50              0.11        13    0.038       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.032, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.08 MB/s write, 0.13 GB read, 0.07 MB/s read, 2.4 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 12.56 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000105 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(715,12.05 MB,3.96374%) FilterBlock(27,179.92 KB,0.0577977%) IndexBlock(27,341.11 KB,0.109577%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.761 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "c42862e3-817c-4dff-a467-eb6a68749618" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.762 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.763 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "c42862e3-817c-4dff-a467-eb6a68749618-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.763 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.763 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.764 227317 INFO nova.compute.manager [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Terminating instance
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.765 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.766 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquired lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.766 227317 DEBUG nova.network.neutron [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.964 227317 DEBUG nova.network.neutron [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:11:43 np0005596062 nova_compute[227313]: 2026-01-26 18:11:43.988 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:44 np0005596062 nova_compute[227313]: 2026-01-26 18:11:44.303 227317 DEBUG nova.network.neutron [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:11:44 np0005596062 nova_compute[227313]: 2026-01-26 18:11:44.320 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Releasing lock "refresh_cache-c42862e3-817c-4dff-a467-eb6a68749618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:11:44 np0005596062 nova_compute[227313]: 2026-01-26 18:11:44.321 227317 DEBUG nova.compute.manager [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 13:11:44 np0005596062 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 26 13:11:44 np0005596062 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 3.713s CPU time.
Jan 26 13:11:44 np0005596062 systemd-machined[195380]: Machine qemu-8-instance-0000000a terminated.
Jan 26 13:11:44 np0005596062 nova_compute[227313]: 2026-01-26 18:11:44.546 227317 INFO nova.virt.libvirt.driver [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance destroyed successfully.
Jan 26 13:11:44 np0005596062 nova_compute[227313]: 2026-01-26 18:11:44.547 227317 DEBUG nova.objects.instance [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lazy-loading 'resources' on Instance uuid c42862e3-817c-4dff-a467-eb6a68749618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:11:44 np0005596062 nova_compute[227313]: 2026-01-26 18:11:44.975 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:45.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:45.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 26 13:11:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:47.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:47.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.472 227317 INFO nova.virt.libvirt.driver [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Deleting instance files /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618_del
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.473 227317 INFO nova.virt.libvirt.driver [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Deletion of /var/lib/nova/instances/c42862e3-817c-4dff-a467-eb6a68749618_del complete
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.544 227317 INFO nova.compute.manager [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Took 4.22 seconds to destroy the instance on the hypervisor.
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.545 227317 DEBUG oslo.service.loopingcall [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.545 227317 DEBUG nova.compute.manager [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.546 227317 DEBUG nova.network.neutron [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.954 227317 DEBUG nova.network.neutron [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.992 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:48 np0005596062 nova_compute[227313]: 2026-01-26 18:11:48.995 227317 DEBUG nova.network.neutron [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.028 227317 INFO nova.compute.manager [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Took 0.48 seconds to deallocate network for instance.
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.180 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.181 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.217 227317 DEBUG nova.scheduler.client.report [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 26 13:11:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:49.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.538 227317 DEBUG nova.scheduler.client.report [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.538 227317 DEBUG nova.compute.provider_tree [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.556 227317 DEBUG nova.scheduler.client.report [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 26 13:11:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:49.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.592 227317 DEBUG nova.scheduler.client.report [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 26 13:11:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.634 227317 DEBUG oslo_concurrency.processutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:11:49 np0005596062 nova_compute[227313]: 2026-01-26 18:11:49.977 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:11:50 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3952447373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:11:50 np0005596062 nova_compute[227313]: 2026-01-26 18:11:50.144 227317 DEBUG oslo_concurrency.processutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:11:50 np0005596062 nova_compute[227313]: 2026-01-26 18:11:50.152 227317 DEBUG nova.compute.provider_tree [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:11:50 np0005596062 nova_compute[227313]: 2026-01-26 18:11:50.169 227317 DEBUG nova.scheduler.client.report [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:11:50 np0005596062 nova_compute[227313]: 2026-01-26 18:11:50.207 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:11:50 np0005596062 nova_compute[227313]: 2026-01-26 18:11:50.233 227317 INFO nova.scheduler.client.report [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Deleted allocations for instance c42862e3-817c-4dff-a467-eb6a68749618
Jan 26 13:11:50 np0005596062 nova_compute[227313]: 2026-01-26 18:11:50.303 227317 DEBUG oslo_concurrency.lockutils [None req-1ca4d44d-3375-49f2-8629-c5e91920f91b f7f416104f314e4db87eda6b639ad3e0 9b25b015b6314a2baab5bf794d8d5526 - - default default] Lock "c42862e3-817c-4dff-a467-eb6a68749618" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:11:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:51.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:51.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:53 np0005596062 nova_compute[227313]: 2026-01-26 18:11:53.069 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:53 np0005596062 nova_compute[227313]: 2026-01-26 18:11:53.069 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 26 13:11:53 np0005596062 nova_compute[227313]: 2026-01-26 18:11:53.110 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 26 13:11:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:11:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:53.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:11:53 np0005596062 nova_compute[227313]: 2026-01-26 18:11:53.995 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:54 np0005596062 nova_compute[227313]: 2026-01-26 18:11:54.979 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:55 np0005596062 nova_compute[227313]: 2026-01-26 18:11:55.092 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:55 np0005596062 nova_compute[227313]: 2026-01-26 18:11:55.092 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:55 np0005596062 nova_compute[227313]: 2026-01-26 18:11:55.092 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:11:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:55.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:55.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:56 np0005596062 nova_compute[227313]: 2026-01-26 18:11:56.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:56 np0005596062 nova_compute[227313]: 2026-01-26 18:11:56.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:57 np0005596062 nova_compute[227313]: 2026-01-26 18:11:57.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:11:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:11:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:57.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:11:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:11:57 np0005596062 nova_compute[227313]: 2026-01-26 18:11:57.722 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:11:57 np0005596062 nova_compute[227313]: 2026-01-26 18:11:57.722 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:11:57 np0005596062 nova_compute[227313]: 2026-01-26 18:11:57.722 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:11:57 np0005596062 nova_compute[227313]: 2026-01-26 18:11:57.723 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 13:11:57 np0005596062 nova_compute[227313]: 2026-01-26 18:11:57.723 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:11:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:11:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3241063326' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:11:58 np0005596062 nova_compute[227313]: 2026-01-26 18:11:58.531 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.808s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:11:58 np0005596062 nova_compute[227313]: 2026-01-26 18:11:58.736 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:11:58 np0005596062 nova_compute[227313]: 2026-01-26 18:11:58.738 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4796MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 13:11:58 np0005596062 nova_compute[227313]: 2026-01-26 18:11:58.738 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:11:58 np0005596062 nova_compute[227313]: 2026-01-26 18:11:58.739 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:11:58 np0005596062 nova_compute[227313]: 2026-01-26 18:11:58.999 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:11:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:11:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:11:59.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:11:59 np0005596062 nova_compute[227313]: 2026-01-26 18:11:59.544 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451104.5425153, c42862e3-817c-4dff-a467-eb6a68749618 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:11:59 np0005596062 nova_compute[227313]: 2026-01-26 18:11:59.545 227317 INFO nova.compute.manager [-] [instance: c42862e3-817c-4dff-a467-eb6a68749618] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:11:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:11:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:11:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:11:59.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:11:59 np0005596062 nova_compute[227313]: 2026-01-26 18:11:59.981 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:00 np0005596062 nova_compute[227313]: 2026-01-26 18:12:00.919 227317 DEBUG nova.compute.manager [None req-e33f5e1a-d100-4a7c-a47a-a35c024afb24 - - - - - -] [instance: c42862e3-817c-4dff-a467-eb6a68749618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.034 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.034 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.270 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:01.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:01.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:12:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/103705116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.729 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.735 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.755 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.790 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.791 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:01 np0005596062 nova_compute[227313]: 2026-01-26 18:12:01.792 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.801 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.802 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.831 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.831 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.832 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.853 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.854 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:02 np0005596062 nova_compute[227313]: 2026-01-26 18:12:02.854 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:03.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:03.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:03 np0005596062 podman[237097]: 2026-01-26 18:12:03.876850115 +0000 UTC m=+0.079666224 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:12:04 np0005596062 nova_compute[227313]: 2026-01-26 18:12:04.002 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:04 np0005596062 nova_compute[227313]: 2026-01-26 18:12:04.983 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:05.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:05.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:07.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:07.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:09 np0005596062 nova_compute[227313]: 2026-01-26 18:12:09.005 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:09.161 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:09.161 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:09.161 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:09.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:09.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:09 np0005596062 nova_compute[227313]: 2026-01-26 18:12:09.985 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:11.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:11.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:11 np0005596062 podman[237172]: 2026-01-26 18:12:11.925565118 +0000 UTC m=+0.135539982 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:12:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:13.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:14 np0005596062 nova_compute[227313]: 2026-01-26 18:12:14.006 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:14 np0005596062 nova_compute[227313]: 2026-01-26 18:12:14.987 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:15.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:15.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:12:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 9797 writes, 40K keys, 9797 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 9797 writes, 2665 syncs, 3.68 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4105 writes, 16K keys, 4105 commit groups, 1.0 writes per commit group, ingest: 14.16 MB, 0.02 MB/s#012Interval WAL: 4105 writes, 1675 syncs, 2.45 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:12:16 np0005596062 ceph-mgr[77538]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2716354406
Jan 26 13:12:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:17.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:17.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.272 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.272 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.380 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.811 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.812 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.828 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:12:18 np0005596062 nova_compute[227313]: 2026-01-26 18:12:18.829 227317 INFO nova.compute.claims [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.010 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.191 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:19.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:19.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:12:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1015239358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.665 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.675 227317 DEBUG nova.compute.provider_tree [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.737 227317 DEBUG nova.scheduler.client.report [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.907 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:19 np0005596062 nova_compute[227313]: 2026-01-26 18:12:19.908 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:12:20 np0005596062 nova_compute[227313]: 2026-01-26 18:12:20.004 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:20 np0005596062 nova_compute[227313]: 2026-01-26 18:12:20.515 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:12:20 np0005596062 nova_compute[227313]: 2026-01-26 18:12:20.515 227317 DEBUG nova.network.neutron [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:12:20 np0005596062 nova_compute[227313]: 2026-01-26 18:12:20.534 227317 INFO nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:12:20 np0005596062 nova_compute[227313]: 2026-01-26 18:12:20.725 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:12:20 np0005596062 nova_compute[227313]: 2026-01-26 18:12:20.795 227317 DEBUG nova.policy [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d020da9c5434489960da2631ebbc118', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '578fc64b175945c785ac201f680d3471', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.175 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.177 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.177 227317 INFO nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Creating image(s)#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.211 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.244 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.283 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.288 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.357 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.359 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.359 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.360 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.395 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.401 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:21.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:21.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:21 np0005596062 nova_compute[227313]: 2026-01-26 18:12:21.969 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.009 227317 DEBUG nova.network.neutron [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Successfully created port: 5249aff9-7e40-4a33-ae01-3f575d4e623d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.064 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] resizing rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:12:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.189 227317 DEBUG nova.objects.instance [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lazy-loading 'migration_context' on Instance uuid 24fa04e2-99c5-450d-9be4-a80e22fcb516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.202 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.203 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Ensure instance console log exists: /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.203 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.204 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:22 np0005596062 nova_compute[227313]: 2026-01-26 18:12:22.204 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:23.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:23 np0005596062 nova_compute[227313]: 2026-01-26 18:12:23.589 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:23 np0005596062 nova_compute[227313]: 2026-01-26 18:12:23.615 227317 WARNING nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Jan 26 13:12:23 np0005596062 nova_compute[227313]: 2026-01-26 18:12:23.615 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Triggering sync for uuid 24fa04e2-99c5-450d-9be4-a80e22fcb516 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 26 13:12:23 np0005596062 nova_compute[227313]: 2026-01-26 18:12:23.615 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:23.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:24 np0005596062 nova_compute[227313]: 2026-01-26 18:12:24.013 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:24 np0005596062 nova_compute[227313]: 2026-01-26 18:12:24.919 227317 DEBUG nova.network.neutron [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Successfully updated port: 5249aff9-7e40-4a33-ae01-3f575d4e623d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:12:24 np0005596062 nova_compute[227313]: 2026-01-26 18:12:24.936 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:12:24 np0005596062 nova_compute[227313]: 2026-01-26 18:12:24.936 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquired lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:12:24 np0005596062 nova_compute[227313]: 2026-01-26 18:12:24.936 227317 DEBUG nova.network.neutron [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:12:25 np0005596062 nova_compute[227313]: 2026-01-26 18:12:25.007 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:25 np0005596062 nova_compute[227313]: 2026-01-26 18:12:25.097 227317 DEBUG nova.compute.manager [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-changed-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:12:25 np0005596062 nova_compute[227313]: 2026-01-26 18:12:25.098 227317 DEBUG nova.compute.manager [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Refreshing instance network info cache due to event network-changed-5249aff9-7e40-4a33-ae01-3f575d4e623d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:12:25 np0005596062 nova_compute[227313]: 2026-01-26 18:12:25.098 227317 DEBUG oslo_concurrency.lockutils [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:12:25 np0005596062 nova_compute[227313]: 2026-01-26 18:12:25.189 227317 DEBUG nova.network.neutron [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:12:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:25.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:25.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:27.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:27.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.807 227317 DEBUG nova.network.neutron [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updating instance_info_cache with network_info: [{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.919 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Releasing lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.919 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Instance network_info: |[{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.920 227317 DEBUG oslo_concurrency.lockutils [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.920 227317 DEBUG nova.network.neutron [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Refreshing network info cache for port 5249aff9-7e40-4a33-ae01-3f575d4e623d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.923 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Start _get_guest_xml network_info=[{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.928 227317 WARNING nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.933 227317 DEBUG nova.virt.libvirt.host [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.933 227317 DEBUG nova.virt.libvirt.host [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.936 227317 DEBUG nova.virt.libvirt.host [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.937 227317 DEBUG nova.virt.libvirt.host [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.938 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.938 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.938 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.939 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.939 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.939 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.939 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.940 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.940 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.940 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.940 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.940 227317 DEBUG nova.virt.hardware [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:12:27 np0005596062 nova_compute[227313]: 2026-01-26 18:12:27.943 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:12:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3337474700' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.399 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.431 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.436 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:12:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2971359583' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.905 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.907 227317 DEBUG nova.virt.libvirt.vif [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:12:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-775880326',display_name='tempest-SecurityGroupsTestJSON-server-775880326',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-775880326',id=12,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='578fc64b175945c785ac201f680d3471',ramdisk_id='',reservation_id='r-majveexy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1614836843',owner_user_name='tempest-SecurityGroupsTestJSON-1614836843-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:12:20Z,user_data=None,user_id='8d020da9c5434489960da2631ebbc118',uuid=24fa04e2-99c5-450d-9be4-a80e22fcb516,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.907 227317 DEBUG nova.network.os_vif_util [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Converting VIF {"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.908 227317 DEBUG nova.network.os_vif_util [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.910 227317 DEBUG nova.objects.instance [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lazy-loading 'pci_devices' on Instance uuid 24fa04e2-99c5-450d-9be4-a80e22fcb516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.978 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <uuid>24fa04e2-99c5-450d-9be4-a80e22fcb516</uuid>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <name>instance-0000000c</name>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:name>tempest-SecurityGroupsTestJSON-server-775880326</nova:name>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:12:27</nova:creationTime>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:user uuid="8d020da9c5434489960da2631ebbc118">tempest-SecurityGroupsTestJSON-1614836843-project-member</nova:user>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:project uuid="578fc64b175945c785ac201f680d3471">tempest-SecurityGroupsTestJSON-1614836843</nova:project>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <nova:port uuid="5249aff9-7e40-4a33-ae01-3f575d4e623d">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <entry name="serial">24fa04e2-99c5-450d-9be4-a80e22fcb516</entry>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <entry name="uuid">24fa04e2-99c5-450d-9be4-a80e22fcb516</entry>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/24fa04e2-99c5-450d-9be4-a80e22fcb516_disk">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/24fa04e2-99c5-450d-9be4-a80e22fcb516_disk.config">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:38:d6:fe"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <target dev="tap5249aff9-7e"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/console.log" append="off"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:12:28 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:12:28 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:12:28 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:12:28 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.980 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Preparing to wait for external event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.981 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.981 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.981 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.982 227317 DEBUG nova.virt.libvirt.vif [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:12:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-775880326',display_name='tempest-SecurityGroupsTestJSON-server-775880326',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-775880326',id=12,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='578fc64b175945c785ac201f680d3471',ramdisk_id='',reservation_id='r-majveexy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1614836843',owner_user_name='tempest-SecurityGroupsTestJSON-1614836843-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:12:20Z,user_data=None,user_id='8d020da9c5434489960da2631ebbc118',uuid=24fa04e2-99c5-450d-9be4-a80e22fcb516,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.983 227317 DEBUG nova.network.os_vif_util [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Converting VIF {"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.983 227317 DEBUG nova.network.os_vif_util [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.984 227317 DEBUG os_vif [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.985 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.985 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.986 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.991 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.992 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5249aff9-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.993 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5249aff9-7e, col_values=(('external_ids', {'iface-id': '5249aff9-7e40-4a33-ae01-3f575d4e623d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:d6:fe', 'vm-uuid': '24fa04e2-99c5-450d-9be4-a80e22fcb516'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.996 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:28 np0005596062 nova_compute[227313]: 2026-01-26 18:12:28.998 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:12:28 np0005596062 NetworkManager[48993]: <info>  [1769451148.9983] manager: (tap5249aff9-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.007 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.009 227317 INFO os_vif [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e')#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.131 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.132 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.132 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] No VIF found with MAC fa:16:3e:38:d6:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.132 227317 INFO nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Using config drive#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.165 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:29.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:29.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.749 227317 DEBUG nova.network.neutron [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updated VIF entry in instance network info cache for port 5249aff9-7e40-4a33-ae01-3f575d4e623d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.750 227317 DEBUG nova.network.neutron [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updating instance_info_cache with network_info: [{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.800 227317 DEBUG oslo_concurrency.lockutils [req-a24d8ced-7fcb-4d7f-8196-7b0da092dc4a req-241c924f-6725-4583-8153-5f655846fb48 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.815 227317 INFO nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Creating config drive at /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/disk.config#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.820 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsnf33lrc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.959 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsnf33lrc" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.994 227317 DEBUG nova.storage.rbd_utils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] rbd image 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:12:29 np0005596062 nova_compute[227313]: 2026-01-26 18:12:29.999 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/disk.config 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.025 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.284 227317 DEBUG oslo_concurrency.processutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/disk.config 24fa04e2-99c5-450d-9be4-a80e22fcb516_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.285 227317 INFO nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Deleting local config drive /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516/disk.config because it was imported into RBD.#033[00m
Jan 26 13:12:30 np0005596062 kernel: tap5249aff9-7e: entered promiscuous mode
Jan 26 13:12:30 np0005596062 NetworkManager[48993]: <info>  [1769451150.3475] manager: (tap5249aff9-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 26 13:12:30 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:30Z|00089|binding|INFO|Claiming lport 5249aff9-7e40-4a33-ae01-3f575d4e623d for this chassis.
Jan 26 13:12:30 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:30Z|00090|binding|INFO|5249aff9-7e40-4a33-ae01-3f575d4e623d: Claiming fa:16:3e:38:d6:fe 10.100.0.14
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.349 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.372 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:d6:fe 10.100.0.14'], port_security=['fa:16:3e:38:d6:fe 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '24fa04e2-99c5-450d-9be4-a80e22fcb516', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '578fc64b175945c785ac201f680d3471', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89530ca9-a472-41e1-b750-a44b53f09291', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=311b5b12-2491-4714-bc5f-8af39981400c, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=5249aff9-7e40-4a33-ae01-3f575d4e623d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.373 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 5249aff9-7e40-4a33-ae01-3f575d4e623d in datapath 606327b0-bc8e-49e0-8a3f-009c1401e85f bound to our chassis#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.374 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 606327b0-bc8e-49e0-8a3f-009c1401e85f#033[00m
Jan 26 13:12:30 np0005596062 systemd-udevd[237580]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:12:30 np0005596062 systemd-machined[195380]: New machine qemu-9-instance-0000000c.
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.390 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f39b4652-369c-4809-bd7a-0e5e70af2e81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.391 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap606327b0-b1 in ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.394 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap606327b0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.394 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[757ffdff-5f75-440d-9878-34ca5d051d56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.395 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb72f8e-8a84-40aa-b787-c67a1b6da947]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 NetworkManager[48993]: <info>  [1769451150.4001] device (tap5249aff9-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:12:30 np0005596062 NetworkManager[48993]: <info>  [1769451150.4013] device (tap5249aff9-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.409 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[9615b149-b5c2-4cec-bed2-fb21ffbf3cc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.433 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.435 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0ede9f8a-115e-4406-be1c-b90eed3fbc3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:30Z|00091|binding|INFO|Setting lport 5249aff9-7e40-4a33-ae01-3f575d4e623d ovn-installed in OVS
Jan 26 13:12:30 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:30Z|00092|binding|INFO|Setting lport 5249aff9-7e40-4a33-ae01-3f575d4e623d up in Southbound
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.449 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.472 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[1daa5443-25c3-4eb3-8468-c54fa12409e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 NetworkManager[48993]: <info>  [1769451150.4801] manager: (tap606327b0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.480 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2bea36-ed62-4792-91ae-bec183a575b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.518 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[7694be9c-b334-413e-aa9f-5030977ee9a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.522 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[69d7d485-7696-4173-8cc2-9002779a7368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 NetworkManager[48993]: <info>  [1769451150.5469] device (tap606327b0-b0): carrier: link connected
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.554 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[853a5249-0158-47bb-9881-c11f2fd0a286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.574 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1e9170e3-6e04-4130-a1db-c08c764a710b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap606327b0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:57:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491059, 'reachable_time': 42379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237614, 'error': None, 'target': 'ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.593 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[3268f2a9-267f-413b-8551-9bfc71375bc7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:57c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 491059, 'tstamp': 491059}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237615, 'error': None, 'target': 'ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.613 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f1df315e-0efe-4d00-a26b-274c31fd68a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap606327b0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:57:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491059, 'reachable_time': 42379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237616, 'error': None, 'target': 'ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.652 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[900717cf-146e-451e-9f13-8d365b29ecd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.730 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a526100f-b0fe-4fb7-b013-54948bafd678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.732 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap606327b0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.732 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.733 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap606327b0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.734 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 NetworkManager[48993]: <info>  [1769451150.7357] manager: (tap606327b0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 26 13:12:30 np0005596062 kernel: tap606327b0-b0: entered promiscuous mode
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.738 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.739 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap606327b0-b0, col_values=(('external_ids', {'iface-id': '02272577-c2d6-4839-9227-163aea5b2336'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:12:30 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:30Z|00093|binding|INFO|Releasing lport 02272577-c2d6-4839-9227-163aea5b2336 from this chassis (sb_readonly=0)
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.740 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.741 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.743 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/606327b0-bc8e-49e0-8a3f-009c1401e85f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/606327b0-bc8e-49e0-8a3f-009c1401e85f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.744 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[68d7b263-a729-4946-85e6-0d78c6219d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.745 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-606327b0-bc8e-49e0-8a3f-009c1401e85f
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/606327b0-bc8e-49e0-8a3f-009c1401e85f.pid.haproxy
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 606327b0-bc8e-49e0-8a3f-009c1401e85f
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:12:30 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:30.746 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'env', 'PROCESS_TAG=haproxy-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/606327b0-bc8e-49e0-8a3f-009c1401e85f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.755 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.825 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451150.8242354, 24fa04e2-99c5-450d-9be4-a80e22fcb516 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.826 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] VM Started (Lifecycle Event)#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.867 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.874 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451150.8260312, 24fa04e2-99c5-450d-9be4-a80e22fcb516 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.875 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.900 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.905 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.953 227317 DEBUG nova.compute.manager [req-698bfd6c-445d-4d55-a3fb-d909689d609c req-e117b032-f099-499c-93bf-cec1cc003d7a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.954 227317 DEBUG oslo_concurrency.lockutils [req-698bfd6c-445d-4d55-a3fb-d909689d609c req-e117b032-f099-499c-93bf-cec1cc003d7a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.954 227317 DEBUG oslo_concurrency.lockutils [req-698bfd6c-445d-4d55-a3fb-d909689d609c req-e117b032-f099-499c-93bf-cec1cc003d7a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.955 227317 DEBUG oslo_concurrency.lockutils [req-698bfd6c-445d-4d55-a3fb-d909689d609c req-e117b032-f099-499c-93bf-cec1cc003d7a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.956 227317 DEBUG nova.compute.manager [req-698bfd6c-445d-4d55-a3fb-d909689d609c req-e117b032-f099-499c-93bf-cec1cc003d7a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Processing event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.956 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.957 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.962 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451150.961684, 24fa04e2-99c5-450d-9be4-a80e22fcb516 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.962 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.964 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.968 227317 INFO nova.virt.libvirt.driver [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Instance spawned successfully.#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.968 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.986 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.992 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.996 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.997 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.997 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.998 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.998 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:12:30 np0005596062 nova_compute[227313]: 2026-01-26 18:12:30.999 227317 DEBUG nova.virt.libvirt.driver [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.045 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.084 227317 INFO nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Took 9.91 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.085 227317 DEBUG nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:12:31 np0005596062 podman[237689]: 2026-01-26 18:12:31.195225377 +0000 UTC m=+0.055334566 container create 30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.214 227317 INFO nova.compute.manager [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Took 12.44 seconds to build instance.#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.235 227317 DEBUG oslo_concurrency.lockutils [None req-0189af34-1854-40a6-9bd7-d21d974a06cd 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.236 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 7.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.236 227317 INFO nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:12:31 np0005596062 nova_compute[227313]: 2026-01-26 18:12:31.236 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:31 np0005596062 systemd[1]: Started libpod-conmon-30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92.scope.
Jan 26 13:12:31 np0005596062 podman[237689]: 2026-01-26 18:12:31.168435113 +0000 UTC m=+0.028544322 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:12:31 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:12:31 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a1c03d39e7c06fee31cc0387873247320f00ced3dd9f035656b7d797bb5cc60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:12:31 np0005596062 podman[237689]: 2026-01-26 18:12:31.296947627 +0000 UTC m=+0.157056856 container init 30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:12:31 np0005596062 podman[237689]: 2026-01-26 18:12:31.304536479 +0000 UTC m=+0.164645668 container start 30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 26 13:12:31 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [NOTICE]   (237708) : New worker (237710) forked
Jan 26 13:12:31 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [NOTICE]   (237708) : Loading success.
Jan 26 13:12:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:31.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:31.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.142 227317 DEBUG nova.compute.manager [req-0263f626-15ec-42f1-b42e-abc1f63a36d8 req-4c5ba143-78aa-4005-af36-e0e8f23a3169 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.143 227317 DEBUG oslo_concurrency.lockutils [req-0263f626-15ec-42f1-b42e-abc1f63a36d8 req-4c5ba143-78aa-4005-af36-e0e8f23a3169 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.143 227317 DEBUG oslo_concurrency.lockutils [req-0263f626-15ec-42f1-b42e-abc1f63a36d8 req-4c5ba143-78aa-4005-af36-e0e8f23a3169 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.144 227317 DEBUG oslo_concurrency.lockutils [req-0263f626-15ec-42f1-b42e-abc1f63a36d8 req-4c5ba143-78aa-4005-af36-e0e8f23a3169 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.144 227317 DEBUG nova.compute.manager [req-0263f626-15ec-42f1-b42e-abc1f63a36d8 req-4c5ba143-78aa-4005-af36-e0e8f23a3169 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] No waiting events found dispatching network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.144 227317 WARNING nova.compute.manager [req-0263f626-15ec-42f1-b42e-abc1f63a36d8 req-4c5ba143-78aa-4005-af36-e0e8f23a3169 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received unexpected event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d for instance with vm_state active and task_state None.#033[00m
Jan 26 13:12:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:33.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:33.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:33 np0005596062 nova_compute[227313]: 2026-01-26 18:12:33.996 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:34 np0005596062 podman[237721]: 2026-01-26 18:12:34.859116775 +0000 UTC m=+0.064064269 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 26 13:12:35 np0005596062 nova_compute[227313]: 2026-01-26 18:12:35.010 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:35.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:35.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.340 227317 DEBUG nova.compute.manager [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-changed-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.340 227317 DEBUG nova.compute.manager [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Refreshing instance network info cache due to event network-changed-5249aff9-7e40-4a33-ae01-3f575d4e623d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.340 227317 DEBUG oslo_concurrency.lockutils [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.340 227317 DEBUG oslo_concurrency.lockutils [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.341 227317 DEBUG nova.network.neutron [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Refreshing network info cache for port 5249aff9-7e40-4a33-ae01-3f575d4e623d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.596 227317 DEBUG nova.compute.manager [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-changed-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.597 227317 DEBUG nova.compute.manager [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Refreshing instance network info cache due to event network-changed-5249aff9-7e40-4a33-ae01-3f575d4e623d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 13:12:36 np0005596062 nova_compute[227313]: 2026-01-26 18:12:36.597 227317 DEBUG oslo_concurrency.lockutils [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:12:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:37.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:37.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:37 np0005596062 nova_compute[227313]: 2026-01-26 18:12:37.876 227317 DEBUG nova.network.neutron [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updated VIF entry in instance network info cache for port 5249aff9-7e40-4a33-ae01-3f575d4e623d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 13:12:37 np0005596062 nova_compute[227313]: 2026-01-26 18:12:37.877 227317 DEBUG nova.network.neutron [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updating instance_info_cache with network_info: [{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:12:37 np0005596062 nova_compute[227313]: 2026-01-26 18:12:37.917 227317 DEBUG oslo_concurrency.lockutils [req-9d17022e-536f-484a-8dcd-962b27245a19 req-553d4f85-5361-48d0-987d-4b8eace3e603 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:12:37 np0005596062 nova_compute[227313]: 2026-01-26 18:12:37.918 227317 DEBUG oslo_concurrency.lockutils [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:12:37 np0005596062 nova_compute[227313]: 2026-01-26 18:12:37.918 227317 DEBUG nova.network.neutron [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Refreshing network info cache for port 5249aff9-7e40-4a33-ae01-3f575d4e623d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 13:12:38 np0005596062 nova_compute[227313]: 2026-01-26 18:12:38.999 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:39.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:39.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:40 np0005596062 nova_compute[227313]: 2026-01-26 18:12:40.011 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:40 np0005596062 nova_compute[227313]: 2026-01-26 18:12:40.063 227317 DEBUG nova.network.neutron [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updated VIF entry in instance network info cache for port 5249aff9-7e40-4a33-ae01-3f575d4e623d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 13:12:40 np0005596062 nova_compute[227313]: 2026-01-26 18:12:40.064 227317 DEBUG nova.network.neutron [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updating instance_info_cache with network_info: [{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:12:40 np0005596062 nova_compute[227313]: 2026-01-26 18:12:40.081 227317 DEBUG oslo_concurrency.lockutils [req-34c5f32e-8f3f-4efd-8907-077be630217c req-81f6e6cc-3421-4c25-917e-26a5c1bb23e6 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:12:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:12:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/633037466' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:12:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:12:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/633037466' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:12:41 np0005596062 nova_compute[227313]: 2026-01-26 18:12:41.050 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:41.050 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 13:12:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:41.054 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 13:12:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:41.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:41.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:42 np0005596062 podman[237748]: 2026-01-26 18:12:42.939587436 +0000 UTC m=+0.143731341 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 13:12:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:12:43.057 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 13:12:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:43.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:43.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:44 np0005596062 nova_compute[227313]: 2026-01-26 18:12:44.038 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:45 np0005596062 nova_compute[227313]: 2026-01-26 18:12:45.014 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:45.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:45.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:46 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:46Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:d6:fe 10.100.0.14
Jan 26 13:12:46 np0005596062 ovn_controller[133984]: 2026-01-26T18:12:46Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:d6:fe 10.100.0.14
Jan 26 13:12:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:47.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:47.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 26 13:12:49 np0005596062 nova_compute[227313]: 2026-01-26 18:12:49.041 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:49.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:49.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:50 np0005596062 nova_compute[227313]: 2026-01-26 18:12:50.035 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.516948962 +0000 UTC m=+0.053456046 container create 740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 26 13:12:50 np0005596062 systemd[1]: Started libpod-conmon-740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4.scope.
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.491772011 +0000 UTC m=+0.028279095 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 13:12:50 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.641584943 +0000 UTC m=+0.178092027 container init 740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_beaver, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.649580906 +0000 UTC m=+0.186087960 container start 740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_beaver, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.653177942 +0000 UTC m=+0.189685026 container attach 740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_beaver, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef)
Jan 26 13:12:50 np0005596062 musing_beaver[238118]: 167 167
Jan 26 13:12:50 np0005596062 systemd[1]: libpod-740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4.scope: Deactivated successfully.
Jan 26 13:12:50 np0005596062 conmon[238118]: conmon 740506cde8ffb2c80f3e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4.scope/container/memory.events
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.658561085 +0000 UTC m=+0.195068149 container died 740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 26 13:12:50 np0005596062 systemd[1]: var-lib-containers-storage-overlay-b323e0dc1a02b5abf88ecbab4016af300f90f4c3f8d7c3a971777fc08f962775-merged.mount: Deactivated successfully.
Jan 26 13:12:50 np0005596062 podman[238101]: 2026-01-26 18:12:50.929126284 +0000 UTC m=+0.465633378 container remove 740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_beaver, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 13:12:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:50 np0005596062 systemd[1]: libpod-conmon-740506cde8ffb2c80f3ee1106cca522c80d287178152e5549d592cb26b9199f4.scope: Deactivated successfully.
Jan 26 13:12:51 np0005596062 podman[238142]: 2026-01-26 18:12:51.183239865 +0000 UTC m=+0.098669990 container create 46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_fermat, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 13:12:51 np0005596062 podman[238142]: 2026-01-26 18:12:51.115157081 +0000 UTC m=+0.030587226 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 13:12:51 np0005596062 systemd[1]: Started libpod-conmon-46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef.scope.
Jan 26 13:12:51 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:12:51 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6f89bc777f2a4262dc92e61947b916c98411f5eefa2a2e895f8930620298d7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 13:12:51 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6f89bc777f2a4262dc92e61947b916c98411f5eefa2a2e895f8930620298d7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 13:12:51 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6f89bc777f2a4262dc92e61947b916c98411f5eefa2a2e895f8930620298d7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 13:12:51 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad6f89bc777f2a4262dc92e61947b916c98411f5eefa2a2e895f8930620298d7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 13:12:51 np0005596062 podman[238142]: 2026-01-26 18:12:51.485361485 +0000 UTC m=+0.400791630 container init 46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_fermat, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 26 13:12:51 np0005596062 podman[238142]: 2026-01-26 18:12:51.497882719 +0000 UTC m=+0.413312834 container start 46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_fermat, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:12:51 np0005596062 podman[238142]: 2026-01-26 18:12:51.501647719 +0000 UTC m=+0.417077834 container attach 46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_fermat, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 26 13:12:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:51.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:12:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:12:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]: [
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:    {
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "available": false,
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "ceph_device": false,
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "lsm_data": {},
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "lvs": [],
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "path": "/dev/sr0",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "rejected_reasons": [
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "Has a FileSystem",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "Insufficient space (<5GB)"
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        ],
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        "sys_api": {
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "actuators": null,
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "device_nodes": "sr0",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "devname": "sr0",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "human_readable_size": "482.00 KB",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "id_bus": "ata",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "model": "QEMU DVD-ROM",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "nr_requests": "2",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "parent": "/dev/sr0",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "partitions": {},
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "path": "/dev/sr0",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "removable": "1",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "rev": "2.5+",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "ro": "0",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "rotational": "1",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "sas_address": "",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "sas_device_handle": "",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "scheduler_mode": "mq-deadline",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "sectors": 0,
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "sectorsize": "2048",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "size": 493568.0,
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "support_discard": "2048",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "type": "disk",
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:            "vendor": "QEMU"
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:        }
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]:    }
Jan 26 13:12:52 np0005596062 relaxed_fermat[238158]: ]
Jan 26 13:12:52 np0005596062 systemd[1]: libpod-46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef.scope: Deactivated successfully.
Jan 26 13:12:52 np0005596062 podman[238142]: 2026-01-26 18:12:52.899580949 +0000 UTC m=+1.815011124 container died 46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_fermat, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 13:12:52 np0005596062 systemd[1]: libpod-46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef.scope: Consumed 1.395s CPU time.
Jan 26 13:12:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:53.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:12:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:53.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:12:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 26 13:12:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 26 13:12:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 26 13:12:53 np0005596062 systemd[1]: var-lib-containers-storage-overlay-ad6f89bc777f2a4262dc92e61947b916c98411f5eefa2a2e895f8930620298d7-merged.mount: Deactivated successfully.
Jan 26 13:12:53 np0005596062 podman[238142]: 2026-01-26 18:12:53.965994255 +0000 UTC m=+2.881424370 container remove 46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_fermat, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 13:12:53 np0005596062 systemd[1]: libpod-conmon-46fe1d11c681bc7e5fe7f1a9e1a4cf0666d5c9b67ca2a5799a75ed0d25e914ef.scope: Deactivated successfully.
Jan 26 13:12:54 np0005596062 nova_compute[227313]: 2026-01-26 18:12:54.047 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:54 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 26 13:12:54 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 26 13:12:55 np0005596062 nova_compute[227313]: 2026-01-26 18:12:55.039 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:55 np0005596062 nova_compute[227313]: 2026-01-26 18:12:55.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:55 np0005596062 nova_compute[227313]: 2026-01-26 18:12:55.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:12:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:55.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:55.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:12:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:12:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:12:56 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:12:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.324 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.324 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.324 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.325 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:12:57 np0005596062 nova_compute[227313]: 2026-01-26 18:12:57.325 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:57.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:57.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:12:57 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4026088523' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.014 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.689s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.743 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.744 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.920 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.922 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4643MB free_disk=20.94292449951172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.922 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:12:58 np0005596062 nova_compute[227313]: 2026-01-26 18:12:58.922 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.050 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.258 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 24fa04e2-99c5-450d-9be4-a80e22fcb516 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.259 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.259 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.301 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:12:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 26 13:12:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:12:59.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 26 13:12:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:12:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:12:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:12:59.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:12:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:12:59 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2930151516' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.775 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.783 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.803 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.832 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:12:59 np0005596062 nova_compute[227313]: 2026-01-26 18:12:59.832 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:00 np0005596062 nova_compute[227313]: 2026-01-26 18:13:00.042 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:00 np0005596062 nova_compute[227313]: 2026-01-26 18:13:00.832 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:00 np0005596062 nova_compute[227313]: 2026-01-26 18:13:00.833 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:00 np0005596062 nova_compute[227313]: 2026-01-26 18:13:00.833 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:13:00 np0005596062 nova_compute[227313]: 2026-01-26 18:13:00.834 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:13:01 np0005596062 nova_compute[227313]: 2026-01-26 18:13:01.069 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:13:01 np0005596062 nova_compute[227313]: 2026-01-26 18:13:01.070 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:13:01 np0005596062 nova_compute[227313]: 2026-01-26 18:13:01.070 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:13:01 np0005596062 nova_compute[227313]: 2026-01-26 18:13:01.070 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 24fa04e2-99c5-450d-9be4-a80e22fcb516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:13:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:01.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:01.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.050 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updating instance_info_cache with network_info: [{"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.070 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-24fa04e2-99c5-450d-9be4-a80e22fcb516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.070 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:03 np0005596062 nova_compute[227313]: 2026-01-26 18:13:03.071 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:03.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:03.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.053 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.688 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.689 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.689 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.689 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.689 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.691 227317 INFO nova.compute.manager [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Terminating instance#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.692 227317 DEBUG nova.compute.manager [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:13:04 np0005596062 kernel: tap5249aff9-7e (unregistering): left promiscuous mode
Jan 26 13:13:04 np0005596062 NetworkManager[48993]: <info>  [1769451184.7613] device (tap5249aff9-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.777 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 ovn_controller[133984]: 2026-01-26T18:13:04Z|00094|binding|INFO|Releasing lport 5249aff9-7e40-4a33-ae01-3f575d4e623d from this chassis (sb_readonly=0)
Jan 26 13:13:04 np0005596062 ovn_controller[133984]: 2026-01-26T18:13:04Z|00095|binding|INFO|Setting lport 5249aff9-7e40-4a33-ae01-3f575d4e623d down in Southbound
Jan 26 13:13:04 np0005596062 ovn_controller[133984]: 2026-01-26T18:13:04Z|00096|binding|INFO|Removing iface tap5249aff9-7e ovn-installed in OVS
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.780 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:04.787 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:d6:fe 10.100.0.14'], port_security=['fa:16:3e:38:d6:fe 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '24fa04e2-99c5-450d-9be4-a80e22fcb516', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '578fc64b175945c785ac201f680d3471', 'neutron:revision_number': '6', 'neutron:security_group_ids': '89530ca9-a472-41e1-b750-a44b53f09291 aa75f910-db3a-49a0-aacf-da5d4f89d16e bf8c876a-9125-4c84-888b-f8efa39b24be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=311b5b12-2491-4714-bc5f-8af39981400c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=5249aff9-7e40-4a33-ae01-3f575d4e623d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:13:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:04.788 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 5249aff9-7e40-4a33-ae01-3f575d4e623d in datapath 606327b0-bc8e-49e0-8a3f-009c1401e85f unbound from our chassis#033[00m
Jan 26 13:13:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:04.789 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 606327b0-bc8e-49e0-8a3f-009c1401e85f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:13:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:04.791 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e7da1eb5-e5cc-42dd-b309-7c8c2f411f84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:04.791 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f namespace which is not needed anymore#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.866 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 26 13:13:04 np0005596062 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 15.493s CPU time.
Jan 26 13:13:04 np0005596062 systemd-machined[195380]: Machine qemu-9-instance-0000000c terminated.
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.916 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.924 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.934 227317 INFO nova.virt.libvirt.driver [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Instance destroyed successfully.#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.935 227317 DEBUG nova.objects.instance [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lazy-loading 'resources' on Instance uuid 24fa04e2-99c5-450d-9be4-a80e22fcb516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.950 227317 DEBUG nova.virt.libvirt.vif [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:12:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-775880326',display_name='tempest-SecurityGroupsTestJSON-server-775880326',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-775880326',id=12,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:12:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='578fc64b175945c785ac201f680d3471',ramdisk_id='',reservation_id='r-majveexy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1614836843',owner_user_name='tempest-SecurityGroupsTestJSON-1614836843-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:12:31Z,user_data=None,user_id='8d020da9c5434489960da2631ebbc118',uuid=24fa04e2-99c5-450d-9be4-a80e22fcb516,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.951 227317 DEBUG nova.network.os_vif_util [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Converting VIF {"id": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "address": "fa:16:3e:38:d6:fe", "network": {"id": "606327b0-bc8e-49e0-8a3f-009c1401e85f", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-722782129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "578fc64b175945c785ac201f680d3471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5249aff9-7e", "ovs_interfaceid": "5249aff9-7e40-4a33-ae01-3f575d4e623d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.952 227317 DEBUG nova.network.os_vif_util [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.952 227317 DEBUG os_vif [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.955 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.956 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5249aff9-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.958 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.962 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:13:04 np0005596062 nova_compute[227313]: 2026-01-26 18:13:04.965 227317 INFO os_vif [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:d6:fe,bridge_name='br-int',has_traffic_filtering=True,id=5249aff9-7e40-4a33-ae01-3f575d4e623d,network=Network(606327b0-bc8e-49e0-8a3f-009c1401e85f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5249aff9-7e')#033[00m
Jan 26 13:13:05 np0005596062 podman[239617]: 2026-01-26 18:13:05.002242488 +0000 UTC m=+0.099304897 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 13:13:05 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [NOTICE]   (237708) : haproxy version is 2.8.14-c23fe91
Jan 26 13:13:05 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [NOTICE]   (237708) : path to executable is /usr/sbin/haproxy
Jan 26 13:13:05 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [WARNING]  (237708) : Exiting Master process...
Jan 26 13:13:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:13:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:13:05 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [ALERT]    (237708) : Current worker (237710) exited with code 143 (Terminated)
Jan 26 13:13:05 np0005596062 neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f[237704]: [WARNING]  (237708) : All workers exited. Exiting... (0)
Jan 26 13:13:05 np0005596062 systemd[1]: libpod-30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92.scope: Deactivated successfully.
Jan 26 13:13:05 np0005596062 podman[239661]: 2026-01-26 18:13:05.026644118 +0000 UTC m=+0.056445225 container died 30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.044 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:05 np0005596062 systemd[1]: var-lib-containers-storage-overlay-5a1c03d39e7c06fee31cc0387873247320f00ced3dd9f035656b7d797bb5cc60-merged.mount: Deactivated successfully.
Jan 26 13:13:05 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92-userdata-shm.mount: Deactivated successfully.
Jan 26 13:13:05 np0005596062 podman[239661]: 2026-01-26 18:13:05.070223499 +0000 UTC m=+0.100024596 container cleanup 30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:13:05 np0005596062 systemd[1]: libpod-conmon-30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92.scope: Deactivated successfully.
Jan 26 13:13:05 np0005596062 podman[239709]: 2026-01-26 18:13:05.149821549 +0000 UTC m=+0.051216306 container remove 30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.157 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8fa0e7-b14a-498c-aa97-aed6be29d19b]: (4, ('Mon Jan 26 06:13:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f (30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92)\n30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92\nMon Jan 26 06:13:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f (30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92)\n30d1ca274b0b3820899366ea18f84008e12a9fe240af0373e894ff8d3f5a7d92\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.159 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[8dbc2b83-30a3-4cc3-b949-51e045a2027d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.160 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap606327b0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.162 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:05 np0005596062 kernel: tap606327b0-b0: left promiscuous mode
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.177 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.178 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.182 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f58a2df9-9d2a-4507-9871-b24fc465a642]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.204 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c77a2bad-595c-4493-b643-65f14e963222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.207 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e95661e7-0552-4e1c-97fa-ec5009e1e159]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.230 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd281d0-dc93-4341-87c5-ddd77836354b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491051, 'reachable_time': 19107, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239725, 'error': None, 'target': 'ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.233 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-606327b0-bc8e-49e0-8a3f-009c1401e85f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:13:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:05.234 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[33b030b3-6244-4723-8896-f1bda76541e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:13:05 np0005596062 systemd[1]: run-netns-ovnmeta\x2d606327b0\x2dbc8e\x2d49e0\x2d8a3f\x2d009c1401e85f.mount: Deactivated successfully.
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.237 227317 DEBUG nova.compute.manager [req-b28d6afa-5b06-490c-a75a-8d0ae19762f5 req-3e4f76db-1b11-4b7b-bd62-7525216e133f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-vif-unplugged-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.238 227317 DEBUG oslo_concurrency.lockutils [req-b28d6afa-5b06-490c-a75a-8d0ae19762f5 req-3e4f76db-1b11-4b7b-bd62-7525216e133f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.238 227317 DEBUG oslo_concurrency.lockutils [req-b28d6afa-5b06-490c-a75a-8d0ae19762f5 req-3e4f76db-1b11-4b7b-bd62-7525216e133f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.238 227317 DEBUG oslo_concurrency.lockutils [req-b28d6afa-5b06-490c-a75a-8d0ae19762f5 req-3e4f76db-1b11-4b7b-bd62-7525216e133f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.238 227317 DEBUG nova.compute.manager [req-b28d6afa-5b06-490c-a75a-8d0ae19762f5 req-3e4f76db-1b11-4b7b-bd62-7525216e133f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] No waiting events found dispatching network-vif-unplugged-5249aff9-7e40-4a33-ae01-3f575d4e623d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:13:05 np0005596062 nova_compute[227313]: 2026-01-26 18:13:05.238 227317 DEBUG nova.compute.manager [req-b28d6afa-5b06-490c-a75a-8d0ae19762f5 req-3e4f76db-1b11-4b7b-bd62-7525216e133f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-vif-unplugged-5249aff9-7e40-4a33-ae01-3f575d4e623d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:13:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:05.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:05.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:06 np0005596062 nova_compute[227313]: 2026-01-26 18:13:06.043 227317 INFO nova.virt.libvirt.driver [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Deleting instance files /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516_del#033[00m
Jan 26 13:13:06 np0005596062 nova_compute[227313]: 2026-01-26 18:13:06.044 227317 INFO nova.virt.libvirt.driver [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Deletion of /var/lib/nova/instances/24fa04e2-99c5-450d-9be4-a80e22fcb516_del complete#033[00m
Jan 26 13:13:06 np0005596062 nova_compute[227313]: 2026-01-26 18:13:06.094 227317 INFO nova.compute.manager [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Took 1.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:13:06 np0005596062 nova_compute[227313]: 2026-01-26 18:13:06.095 227317 DEBUG oslo.service.loopingcall [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:13:06 np0005596062 nova_compute[227313]: 2026-01-26 18:13:06.095 227317 DEBUG nova.compute.manager [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:13:06 np0005596062 nova_compute[227313]: 2026-01-26 18:13:06.095 227317 DEBUG nova.network.neutron [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:13:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.062 227317 DEBUG nova.network.neutron [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.079 227317 INFO nova.compute.manager [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Took 0.98 seconds to deallocate network for instance.#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.129 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.130 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.175 227317 DEBUG oslo_concurrency.processutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.348 227317 DEBUG nova.compute.manager [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.349 227317 DEBUG oslo_concurrency.lockutils [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.350 227317 DEBUG oslo_concurrency.lockutils [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.350 227317 DEBUG oslo_concurrency.lockutils [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.351 227317 DEBUG nova.compute.manager [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] No waiting events found dispatching network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.351 227317 WARNING nova.compute.manager [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received unexpected event network-vif-plugged-5249aff9-7e40-4a33-ae01-3f575d4e623d for instance with vm_state deleted and task_state None.#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.351 227317 DEBUG nova.compute.manager [req-4fd5c58d-7efe-4afa-b70c-5b7cac546d88 req-eb50251a-09d4-4d42-9be5-f31f985cd6d7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Received event network-vif-deleted-5249aff9-7e40-4a33-ae01-3f575d4e623d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:13:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:07.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:13:07 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2237996741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:13:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.686 227317 DEBUG oslo_concurrency.processutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.694 227317 DEBUG nova.compute.provider_tree [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:13:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:07.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.742 227317 DEBUG nova.scheduler.client.report [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.824 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.853 227317 INFO nova.scheduler.client.report [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Deleted allocations for instance 24fa04e2-99c5-450d-9be4-a80e22fcb516#033[00m
Jan 26 13:13:07 np0005596062 nova_compute[227313]: 2026-01-26 18:13:07.928 227317 DEBUG oslo_concurrency.lockutils [None req-79d75717-2e10-4ad5-b859-bcd41c241740 8d020da9c5434489960da2631ebbc118 578fc64b175945c785ac201f680d3471 - - default default] Lock "24fa04e2-99c5-450d-9be4-a80e22fcb516" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:09.161 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:09.163 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:09.163 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:09.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:09.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:09 np0005596062 nova_compute[227313]: 2026-01-26 18:13:09.960 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:10 np0005596062 nova_compute[227313]: 2026-01-26 18:13:10.046 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:11.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:11.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:13.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:13.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:13 np0005596062 podman[239803]: 2026-01-26 18:13:13.874280751 +0000 UTC m=+0.080407413 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:13:14 np0005596062 nova_compute[227313]: 2026-01-26 18:13:14.964 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:15 np0005596062 nova_compute[227313]: 2026-01-26 18:13:15.048 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:15.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 26 13:13:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:15.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:17.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:17.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:18 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:18.394 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:13:18 np0005596062 nova_compute[227313]: 2026-01-26 18:13:18.395 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:18 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:18.396 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:13:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:19.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:19.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:19 np0005596062 nova_compute[227313]: 2026-01-26 18:13:19.933 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451184.9314818, 24fa04e2-99c5-450d-9be4-a80e22fcb516 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:13:19 np0005596062 nova_compute[227313]: 2026-01-26 18:13:19.934 227317 INFO nova.compute.manager [-] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:13:19 np0005596062 nova_compute[227313]: 2026-01-26 18:13:19.965 227317 DEBUG nova.compute.manager [None req-1363c566-018d-4b02-a24f-2f6e3c25ddbd - - - - - -] [instance: 24fa04e2-99c5-450d-9be4-a80e22fcb516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:13:20 np0005596062 nova_compute[227313]: 2026-01-26 18:13:20.002 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:20 np0005596062 nova_compute[227313]: 2026-01-26 18:13:20.051 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:21.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:21.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:22.398 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:13:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:23.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:23.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:23 np0005596062 nova_compute[227313]: 2026-01-26 18:13:23.956 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:25 np0005596062 nova_compute[227313]: 2026-01-26 18:13:25.005 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:25 np0005596062 nova_compute[227313]: 2026-01-26 18:13:25.053 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:25.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:25.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:27.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:27.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:29.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:29.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:30 np0005596062 nova_compute[227313]: 2026-01-26 18:13:30.008 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:30 np0005596062 nova_compute[227313]: 2026-01-26 18:13:30.058 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:31.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:31.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:33.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:33.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:35 np0005596062 nova_compute[227313]: 2026-01-26 18:13:35.012 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:35 np0005596062 nova_compute[227313]: 2026-01-26 18:13:35.061 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:35.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:35 np0005596062 podman[239889]: 2026-01-26 18:13:35.863208766 +0000 UTC m=+0.067389586 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:13:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:37.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:39.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:40 np0005596062 nova_compute[227313]: 2026-01-26 18:13:40.016 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:40 np0005596062 nova_compute[227313]: 2026-01-26 18:13:40.063 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:41.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:41.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:43.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:43.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:44 np0005596062 podman[239915]: 2026-01-26 18:13:44.919081827 +0000 UTC m=+0.111984265 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 13:13:45 np0005596062 nova_compute[227313]: 2026-01-26 18:13:45.018 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:45 np0005596062 nova_compute[227313]: 2026-01-26 18:13:45.065 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:45.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:45.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:47.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:48.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:49.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:50 np0005596062 nova_compute[227313]: 2026-01-26 18:13:50.023 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:50 np0005596062 nova_compute[227313]: 2026-01-26 18:13:50.067 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:50.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:51.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:52.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:53.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:13:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:54.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:13:55 np0005596062 nova_compute[227313]: 2026-01-26 18:13:55.027 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:55 np0005596062 nova_compute[227313]: 2026-01-26 18:13:55.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:55 np0005596062 nova_compute[227313]: 2026-01-26 18:13:55.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:13:55 np0005596062 nova_compute[227313]: 2026-01-26 18:13:55.070 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:55.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:56.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:13:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:57.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:13:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:13:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:13:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.135 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.136 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.136 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.136 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:13:58 np0005596062 nova_compute[227313]: 2026-01-26 18:13:58.136 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:13:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:13:59 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/244870610' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.219 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.424 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.425 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4861MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.425 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.426 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.560 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.561 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.579 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:13:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:59.581 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:13:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:13:59.583 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:13:59 np0005596062 nova_compute[227313]: 2026-01-26 18:13:59.602 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:13:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:13:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:13:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:13:59.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:14:00 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3734592559' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.029 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.072 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:00.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.247 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.668s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.253 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.310 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.537 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:14:00 np0005596062 nova_compute[227313]: 2026-01-26 18:14:00.537 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:01 np0005596062 nova_compute[227313]: 2026-01-26 18:14:01.538 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:01 np0005596062 nova_compute[227313]: 2026-01-26 18:14:01.538 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:01 np0005596062 nova_compute[227313]: 2026-01-26 18:14:01.539 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:01.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:02 np0005596062 nova_compute[227313]: 2026-01-26 18:14:02.045 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:02 np0005596062 nova_compute[227313]: 2026-01-26 18:14:02.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:02 np0005596062 nova_compute[227313]: 2026-01-26 18:14:02.049 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:14:02 np0005596062 nova_compute[227313]: 2026-01-26 18:14:02.049 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:14:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:02.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:02 np0005596062 nova_compute[227313]: 2026-01-26 18:14:02.146 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:14:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:03 np0005596062 nova_compute[227313]: 2026-01-26 18:14:03.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:03.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:04 np0005596062 nova_compute[227313]: 2026-01-26 18:14:04.047 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:04.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:04 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:04Z|00097|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 26 13:14:05 np0005596062 nova_compute[227313]: 2026-01-26 18:14:05.032 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:05 np0005596062 nova_compute[227313]: 2026-01-26 18:14:05.075 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:05.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:05 np0005596062 podman[240265]: 2026-01-26 18:14:05.96865241 +0000 UTC m=+0.058611485 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 13:14:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:06.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:06.585 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:14:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 13:14:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:14:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 13:14:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:14:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:07.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:14:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:14:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:08.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:09.163 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:09.164 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:09.164 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:09.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:10 np0005596062 nova_compute[227313]: 2026-01-26 18:14:10.037 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:10 np0005596062 nova_compute[227313]: 2026-01-26 18:14:10.077 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:10.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:11.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:12.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:13.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:14.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:15 np0005596062 nova_compute[227313]: 2026-01-26 18:14:15.041 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:15 np0005596062 nova_compute[227313]: 2026-01-26 18:14:15.080 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:15.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:15 np0005596062 podman[240395]: 2026-01-26 18:14:15.844208166 +0000 UTC m=+0.077530508 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:14:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:16.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:14:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:14:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:17.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:18.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:19.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:20 np0005596062 nova_compute[227313]: 2026-01-26 18:14:20.045 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:20 np0005596062 nova_compute[227313]: 2026-01-26 18:14:20.082 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:20.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:20 np0005596062 nova_compute[227313]: 2026-01-26 18:14:20.909 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:20 np0005596062 nova_compute[227313]: 2026-01-26 18:14:20.910 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:20 np0005596062 nova_compute[227313]: 2026-01-26 18:14:20.932 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:14:21 np0005596062 nova_compute[227313]: 2026-01-26 18:14:21.057 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:21 np0005596062 nova_compute[227313]: 2026-01-26 18:14:21.058 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:21 np0005596062 nova_compute[227313]: 2026-01-26 18:14:21.070 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:14:21 np0005596062 nova_compute[227313]: 2026-01-26 18:14:21.070 227317 INFO nova.compute.claims [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:14:21 np0005596062 nova_compute[227313]: 2026-01-26 18:14:21.622 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:21.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:14:22 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3106751130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:14:22 np0005596062 nova_compute[227313]: 2026-01-26 18:14:22.127 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:22 np0005596062 nova_compute[227313]: 2026-01-26 18:14:22.136 227317 DEBUG nova.compute.provider_tree [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:14:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:22.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:22 np0005596062 nova_compute[227313]: 2026-01-26 18:14:22.359 227317 DEBUG nova.scheduler.client.report [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:14:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.407 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.407 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.538 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.538 227317 DEBUG nova.network.neutron [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.578 227317 INFO nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:14:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:23.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.752 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.888 227317 DEBUG nova.policy [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '756b3c236ab34471af9186439bd20de5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26bc1062b076453599f5be48e7cb8915', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.956 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.958 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.959 227317 INFO nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Creating image(s)#033[00m
Jan 26 13:14:23 np0005596062 nova_compute[227313]: 2026-01-26 18:14:23.995 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.032 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.066 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.072 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.139 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.141 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.141 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.142 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:24.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.175 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.181 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 36552a60-fe1c-495f-bc2d-779bbd623626_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.816 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 36552a60-fe1c-495f-bc2d-779bbd623626_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.635s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:24 np0005596062 nova_compute[227313]: 2026-01-26 18:14:24.988 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] resizing rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.048 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.085 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.237 227317 DEBUG nova.objects.instance [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lazy-loading 'migration_context' on Instance uuid 36552a60-fe1c-495f-bc2d-779bbd623626 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.257 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.258 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Ensure instance console log exists: /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.259 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.259 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.259 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:25 np0005596062 nova_compute[227313]: 2026-01-26 18:14:25.306 227317 DEBUG nova.network.neutron [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Successfully created port: 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:14:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:25.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:26.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:26 np0005596062 nova_compute[227313]: 2026-01-26 18:14:26.811 227317 DEBUG nova.network.neutron [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Successfully updated port: 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.354 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.354 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquired lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.355 227317 DEBUG nova.network.neutron [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.390 227317 DEBUG nova.compute.manager [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-changed-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.391 227317 DEBUG nova.compute.manager [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Refreshing instance network info cache due to event network-changed-160d3cc7-cc03-492b-9c27-d7ed9d8654a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.391 227317 DEBUG oslo_concurrency.lockutils [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:14:27 np0005596062 nova_compute[227313]: 2026-01-26 18:14:27.627 227317 DEBUG nova.network.neutron [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:14:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:27.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:28.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:29 np0005596062 nova_compute[227313]: 2026-01-26 18:14:29.145 227317 DEBUG nova.network.neutron [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updating instance_info_cache with network_info: [{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:14:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:29.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.052 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.088 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.156 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Releasing lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.156 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Instance network_info: |[{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.157 227317 DEBUG oslo_concurrency.lockutils [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.157 227317 DEBUG nova.network.neutron [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Refreshing network info cache for port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.161 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Start _get_guest_xml network_info=[{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.165 227317 WARNING nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:14:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:30.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.173 227317 DEBUG nova.virt.libvirt.host [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.174 227317 DEBUG nova.virt.libvirt.host [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.179 227317 DEBUG nova.virt.libvirt.host [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.180 227317 DEBUG nova.virt.libvirt.host [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.182 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.182 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.182 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.183 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.183 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.183 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.183 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.183 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.184 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.184 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.184 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.184 227317 DEBUG nova.virt.hardware [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.187 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:14:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2934930464' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.675 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.700 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:30 np0005596062 nova_compute[227313]: 2026-01-26 18:14:30.705 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:14:31 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/925238128' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.123 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.125 227317 DEBUG nova.virt.libvirt.vif [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:14:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-987586838',display_name='tempest-FloatingIPsAssociationTestJSON-server-987586838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-987586838',id=13,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26bc1062b076453599f5be48e7cb8915',ramdisk_id='',reservation_id='r-1l1mn0fc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-235009531',owner_user_name='tempest-FloatingIPsAssociationTestJSON-235009531-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:14:23Z,user_data=None,user_id='756b3c236ab34471af9186439bd20de5',uuid=36552a60-fe1c-495f-bc2d-779bbd623626,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.125 227317 DEBUG nova.network.os_vif_util [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Converting VIF {"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.126 227317 DEBUG nova.network.os_vif_util [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.127 227317 DEBUG nova.objects.instance [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 36552a60-fe1c-495f-bc2d-779bbd623626 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.338 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <uuid>36552a60-fe1c-495f-bc2d-779bbd623626</uuid>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <name>instance-0000000d</name>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-987586838</nova:name>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:14:30</nova:creationTime>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:user uuid="756b3c236ab34471af9186439bd20de5">tempest-FloatingIPsAssociationTestJSON-235009531-project-member</nova:user>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:project uuid="26bc1062b076453599f5be48e7cb8915">tempest-FloatingIPsAssociationTestJSON-235009531</nova:project>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <nova:port uuid="160d3cc7-cc03-492b-9c27-d7ed9d8654a4">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <entry name="serial">36552a60-fe1c-495f-bc2d-779bbd623626</entry>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <entry name="uuid">36552a60-fe1c-495f-bc2d-779bbd623626</entry>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/36552a60-fe1c-495f-bc2d-779bbd623626_disk">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/36552a60-fe1c-495f-bc2d-779bbd623626_disk.config">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:82:f1:0a"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <target dev="tap160d3cc7-cc"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/console.log" append="off"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:14:31 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:14:31 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:14:31 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:14:31 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.339 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Preparing to wait for external event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.340 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.340 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.340 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.341 227317 DEBUG nova.virt.libvirt.vif [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:14:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-987586838',display_name='tempest-FloatingIPsAssociationTestJSON-server-987586838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-987586838',id=13,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26bc1062b076453599f5be48e7cb8915',ramdisk_id='',reservation_id='r-1l1mn0fc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-235009531',owner_user_name='tempest-FloatingIPsAssociationTestJSON-235009531-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:14:23Z,user_data=None,user_id='756b3c236ab34471af9186439bd20de5',uuid=36552a60-fe1c-495f-bc2d-779bbd623626,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.341 227317 DEBUG nova.network.os_vif_util [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Converting VIF {"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.342 227317 DEBUG nova.network.os_vif_util [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.342 227317 DEBUG os_vif [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.343 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.344 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.344 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.348 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.348 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap160d3cc7-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.349 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap160d3cc7-cc, col_values=(('external_ids', {'iface-id': '160d3cc7-cc03-492b-9c27-d7ed9d8654a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:f1:0a', 'vm-uuid': '36552a60-fe1c-495f-bc2d-779bbd623626'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:31 np0005596062 NetworkManager[48993]: <info>  [1769451271.3518] manager: (tap160d3cc7-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.354 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.358 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.360 227317 INFO os_vif [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc')#033[00m
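[Editor's note] The plug sequence above — `AddBridgeCommand` with `may_exist=True`, `AddPortCommand`, then `DbSetCommand` on the `Interface` row — is os-vif driving OVSDB directly through the ovsdbapp IDL. For readers more familiar with the CLI, a minimal sketch of the equivalent `ovs-vsctl` argv lists, with every name, MAC, and UUID copied verbatim from the log (illustrative only; os-vif does not shell out):

```python
# Sketch: ovs-vsctl equivalents of the ovsdbapp transaction logged above.
# os-vif itself speaks OVSDB via ovsdbapp rather than invoking ovs-vsctl.

def plug_commands(bridge, port, iface_id, mac, vm_uuid):
    """Build argv lists mirroring AddBridgeCommand/AddPortCommand/DbSetCommand."""
    return [
        # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
        ["ovs-vsctl", "--may-exist", "add-br", bridge,
         "--", "set", "Bridge", bridge, "datapath_type=system"],
        # AddPortCommand(bridge=br-int, port=tap..., may_exist=True)
        ["ovs-vsctl", "--may-exist", "add-port", bridge, port],
        # DbSetCommand(table=Interface, record=tap..., external_ids={...})
        ["ovs-vsctl", "set", "Interface", port,
         f"external_ids:iface-id={iface_id}",
         "external_ids:iface-status=active",
         f"external_ids:attached-mac={mac}",
         f"external_ids:vm-uuid={vm_uuid}"],
    ]

cmds = plug_commands(
    "br-int", "tap160d3cc7-cc",
    "160d3cc7-cc03-492b-9c27-d7ed9d8654a4",
    "fa:16:3e:82:f1:0a",
    "36552a60-fe1c-495f-bc2d-779bbd623626",
)
```

Note that the first transaction reports "Transaction caused no change": `br-int` already exists, which is exactly what `--may-exist` tolerates.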
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.626 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.627 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.627 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] No VIF found with MAC fa:16:3e:82:f1:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.627 227317 INFO nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Using config drive#033[00m
Jan 26 13:14:31 np0005596062 nova_compute[227313]: 2026-01-26 18:14:31.658 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:31.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:32.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:33.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.792 227317 INFO nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Creating config drive at /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/disk.config#033[00m
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.798 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsq16wx3l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.929 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsq16wx3l" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.957 227317 DEBUG nova.storage.rbd_utils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] rbd image 36552a60-fe1c-495f-bc2d-779bbd623626_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.961 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/disk.config 36552a60-fe1c-495f-bc2d-779bbd623626_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.984 227317 DEBUG nova.network.neutron [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updated VIF entry in instance network info cache for port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:14:33 np0005596062 nova_compute[227313]: 2026-01-26 18:14:33.984 227317 DEBUG nova.network.neutron [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updating instance_info_cache with network_info: [{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
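[Editor's note] The `instance_info_cache` payload in the line above is plain JSON. A self-contained sketch that walks the same structure (abbreviated to the fields actually read, values copied from the log) and pulls out the fixed IP per device — the address OVN later claims for the port:

```python
import json

# The network_info payload logged above, trimmed to the fields we read.
# (In nova this lives in an objects.InstanceInfoCache blob.)
network_info = json.loads("""
[{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4",
  "address": "fa:16:3e:82:f1:0a",
  "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b",
              "bridge": "br-int",
              "subnets": [{"cidr": "10.100.0.0/28",
                           "ips": [{"address": "10.100.0.13", "type": "fixed"}]}],
              "meta": {"mtu": 1442, "tunneled": true}},
  "devname": "tap160d3cc7-cc"}]
""")

def fixed_ips(vifs):
    """Yield (devname, address) for every fixed IP in the cached VIF list."""
    for vif in vifs:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip["type"] == "fixed":
                    yield vif["devname"], ip["address"]

print(list(fixed_ips(network_info)))  # [('tap160d3cc7-cc', '10.100.0.13')]
```

The MTU of 1442 in `meta` reflects the Geneve tunnel overhead on this tenant network (`"tunneled": true`).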
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.079 227317 DEBUG oslo_concurrency.lockutils [req-ace2b6e7-7107-45c9-8043-20fa5d6deb95 req-113687a3-462f-47d2-a740-8791a60d4e40 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:14:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:34.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.366 227317 DEBUG oslo_concurrency.processutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/disk.config 36552a60-fe1c-495f-bc2d-779bbd623626_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.366 227317 INFO nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Deleting local config drive /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626/disk.config because it was imported into RBD.#033[00m
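[Editor's note] The three lines above are nova's Ceph-backed config-drive flow: build a `config-2` ISO locally with mkisofs, `rbd import` it into the `vms` pool, then delete the local copy. A sketch assembling those two command lines from the values in the log (flags copied from the log; the `-publisher` string and the temp staging dir vary per deployment and are omitted here):

```python
from pathlib import Path

def config_drive_cmds(instances_dir, instance_uuid, staging_dir,
                      pool="vms", ceph_id="openstack",
                      conf="/etc/ceph/ceph.conf"):
    """Sketch of the two commands logged above: build the config-2 ISO
    locally, then import it into RBD; nova removes the local ISO after
    a successful import."""
    iso = str(Path(instances_dir) / instance_uuid / "disk.config")
    mkisofs = ["/usr/bin/mkisofs", "-o", iso,
               "-ldots", "-allow-lowercase", "-allow-multidot",
               "-l", "-quiet", "-J", "-r",
               "-V", "config-2",   # the volume label cloud-init probes for
               staging_dir]
    rbd_import = ["rbd", "import", "--pool", pool, iso,
                  f"{instance_uuid}_disk.config", "--image-format=2",
                  "--id", ceph_id, "--conf", conf]
    return mkisofs, rbd_import
```

The earlier "rbd image ..._disk.config does not exist" DEBUG lines are the pre-import existence checks, not errors.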
Jan 26 13:14:34 np0005596062 kernel: tap160d3cc7-cc: entered promiscuous mode
Jan 26 13:14:34 np0005596062 NetworkManager[48993]: <info>  [1769451274.4272] manager: (tap160d3cc7-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Jan 26 13:14:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:34Z|00098|binding|INFO|Claiming lport 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 for this chassis.
Jan 26 13:14:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:34Z|00099|binding|INFO|160d3cc7-cc03-492b-9c27-d7ed9d8654a4: Claiming fa:16:3e:82:f1:0a 10.100.0.13
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.426 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.435 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.452 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:f1:0a 10.100.0.13'], port_security=['fa:16:3e:82:f1:0a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '36552a60-fe1c-495f-bc2d-779bbd623626', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26bc1062b076453599f5be48e7cb8915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '452a4723-0213-4c72-977b-973f75cbe2f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d145b5c0-6021-423e-868e-7999d10938ca, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=160d3cc7-cc03-492b-9c27-d7ed9d8654a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.454 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 in datapath 35df8c2b-5669-4fac-ac88-6d982b49ef1b bound to our chassis#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.456 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 35df8c2b-5669-4fac-ac88-6d982b49ef1b#033[00m
Jan 26 13:14:34 np0005596062 systemd-udevd[240829]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:14:34 np0005596062 systemd-machined[195380]: New machine qemu-10-instance-0000000d.
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.471 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccdbff9-907d-4330-962b-bf4822f1c233]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 NetworkManager[48993]: <info>  [1769451274.4739] device (tap160d3cc7-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.472 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap35df8c2b-51 in ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:14:34 np0005596062 NetworkManager[48993]: <info>  [1769451274.4745] device (tap160d3cc7-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.475 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap35df8c2b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.475 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[df27f522-4652-4b1e-ad27-1ce46ed9fa0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.476 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[58cb7f05-805b-4673-9244-2fab8611e9b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.487 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[14ce1cb1-7790-41c0-a019-8d766d8cee02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.509 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:34Z|00100|binding|INFO|Setting lport 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 ovn-installed in OVS
Jan 26 13:14:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:34Z|00101|binding|INFO|Setting lport 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 up in Southbound
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.517 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.515 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b7531d0a-e2f0-4994-97f1-610637799e56]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.547 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[89e1bb2e-f44e-4522-9214-c8e3991790bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.553 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[43e9760c-349f-4595-bf47-4fb936944b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 NetworkManager[48993]: <info>  [1769451274.5553] manager: (tap35df8c2b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Jan 26 13:14:34 np0005596062 systemd-udevd[240833]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.586 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[72786162-bb2b-4922-a9f9-e36f2252b6ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.589 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[825a0782-30b6-49e7-910c-c3574a9c28f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 NetworkManager[48993]: <info>  [1769451274.6105] device (tap35df8c2b-50): carrier: link connected
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.616 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[196ca949-01da-4857-82ba-d91f3fe8cc86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.638 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[854a783b-71d7-47b8-a119-29f88a1e411e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap35df8c2b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:b2:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503466, 'reachable_time': 43302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240863, 'error': None, 'target': 'ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.652 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[86a14a75-ea52-4120-a0f4-28cd9b2b9c99]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe97:b2a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503466, 'tstamp': 503466}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240864, 'error': None, 'target': 'ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.672 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[72be20c5-81fc-4bcb-b8a3-7cf13a904fc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap35df8c2b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:b2:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503466, 'reachable_time': 43302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240865, 'error': None, 'target': 'ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.703 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[898fa9d0-8ba6-4abf-a0bf-4e30b1d785de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.772 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4210cd-14f4-402a-98f0-8c262efa9503]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.774 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35df8c2b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.774 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.774 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap35df8c2b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.776 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 NetworkManager[48993]: <info>  [1769451274.7774] manager: (tap35df8c2b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 26 13:14:34 np0005596062 kernel: tap35df8c2b-50: entered promiscuous mode
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.783 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap35df8c2b-50, col_values=(('external_ids', {'iface-id': 'e7ca5b9f-aa29-4279-8a64-e32d070b5763'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:34 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:34Z|00102|binding|INFO|Releasing lport e7ca5b9f-aa29-4279-8a64-e32d070b5763 from this chassis (sb_readonly=0)
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.784 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.787 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 nova_compute[227313]: 2026-01-26 18:14:34.797 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.798 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/35df8c2b-5669-4fac-ac88-6d982b49ef1b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/35df8c2b-5669-4fac-ac88-6d982b49ef1b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.799 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[71de001f-fe8e-4390-b3a2-5a0f03d1fb5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.799 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-35df8c2b-5669-4fac-ac88-6d982b49ef1b
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/35df8c2b-5669-4fac-ac88-6d982b49ef1b.pid.haproxy
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 35df8c2b-5669-4fac-ac88-6d982b49ef1b
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:14:34 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:34.800 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'env', 'PROCESS_TAG=haproxy-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/35df8c2b-5669-4fac-ac88-6d982b49ef1b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.089 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:35 np0005596062 podman[240897]: 2026-01-26 18:14:35.172290755 +0000 UTC m=+0.063883276 container create 20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 26 13:14:35 np0005596062 systemd[1]: Started libpod-conmon-20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6.scope.
Jan 26 13:14:35 np0005596062 podman[240897]: 2026-01-26 18:14:35.139632958 +0000 UTC m=+0.031225499 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:14:35 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:14:35 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/982b9eda88488de538de8127d298db751048592d0a8841b16e1f4c89266c52c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:14:35 np0005596062 podman[240897]: 2026-01-26 18:14:35.281981935 +0000 UTC m=+0.173574416 container init 20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:14:35 np0005596062 podman[240897]: 2026-01-26 18:14:35.291863418 +0000 UTC m=+0.183455899 container start 20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 13:14:35 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [NOTICE]   (240932) : New worker (240937) forked
Jan 26 13:14:35 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [NOTICE]   (240932) : Loading success.
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.491 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451275.4903407, 36552a60-fe1c-495f-bc2d-779bbd623626 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.491 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] VM Started (Lifecycle Event)#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.520 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.525 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451275.4905457, 36552a60-fe1c-495f-bc2d-779bbd623626 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.525 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.545 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.550 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.578 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:14:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:35.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.748 227317 DEBUG nova.compute.manager [req-23f5489a-a9e1-4b13-8b94-38cafbb7428f req-c7b5c24a-ad4b-4271-a2fa-56322b474f60 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.749 227317 DEBUG oslo_concurrency.lockutils [req-23f5489a-a9e1-4b13-8b94-38cafbb7428f req-c7b5c24a-ad4b-4271-a2fa-56322b474f60 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.749 227317 DEBUG oslo_concurrency.lockutils [req-23f5489a-a9e1-4b13-8b94-38cafbb7428f req-c7b5c24a-ad4b-4271-a2fa-56322b474f60 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.749 227317 DEBUG oslo_concurrency.lockutils [req-23f5489a-a9e1-4b13-8b94-38cafbb7428f req-c7b5c24a-ad4b-4271-a2fa-56322b474f60 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.750 227317 DEBUG nova.compute.manager [req-23f5489a-a9e1-4b13-8b94-38cafbb7428f req-c7b5c24a-ad4b-4271-a2fa-56322b474f60 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Processing event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.750 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.754 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451275.7541418, 36552a60-fe1c-495f-bc2d-779bbd623626 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.754 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.756 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.759 227317 INFO nova.virt.libvirt.driver [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Instance spawned successfully.#033[00m
Jan 26 13:14:35 np0005596062 nova_compute[227313]: 2026-01-26 18:14:35.760 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.095 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.098 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.149 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.149 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.149 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.150 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.150 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.151 227317 DEBUG nova.virt.libvirt.driver [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:14:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:36.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.351 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:36 np0005596062 nova_compute[227313]: 2026-01-26 18:14:36.479 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:14:36 np0005596062 podman[240971]: 2026-01-26 18:14:36.884263449 +0000 UTC m=+0.085687295 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.140 227317 INFO nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Took 13.18 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.140 227317 DEBUG nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:14:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:37.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.792 227317 INFO nova.compute.manager [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Took 16.78 seconds to build instance.#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.818 227317 DEBUG oslo_concurrency.lockutils [None req-74f85672-fb1c-40f8-98ed-3cc4f04dd6c8 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.886 227317 DEBUG nova.compute.manager [req-c946a655-eba5-45d4-af54-cabe7b4d5ec4 req-1982f348-bc93-4c4d-a28a-ccea6e5841c8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.887 227317 DEBUG oslo_concurrency.lockutils [req-c946a655-eba5-45d4-af54-cabe7b4d5ec4 req-1982f348-bc93-4c4d-a28a-ccea6e5841c8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.888 227317 DEBUG oslo_concurrency.lockutils [req-c946a655-eba5-45d4-af54-cabe7b4d5ec4 req-1982f348-bc93-4c4d-a28a-ccea6e5841c8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.888 227317 DEBUG oslo_concurrency.lockutils [req-c946a655-eba5-45d4-af54-cabe7b4d5ec4 req-1982f348-bc93-4c4d-a28a-ccea6e5841c8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.888 227317 DEBUG nova.compute.manager [req-c946a655-eba5-45d4-af54-cabe7b4d5ec4 req-1982f348-bc93-4c4d-a28a-ccea6e5841c8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] No waiting events found dispatching network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:14:37 np0005596062 nova_compute[227313]: 2026-01-26 18:14:37.889 227317 WARNING nova.compute.manager [req-c946a655-eba5-45d4-af54-cabe7b4d5ec4 req-1982f348-bc93-4c4d-a28a-ccea6e5841c8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received unexpected event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:14:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:38.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:39.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:40 np0005596062 nova_compute[227313]: 2026-01-26 18:14:40.091 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:40.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:14:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1691814002' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:14:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:14:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1691814002' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:14:41 np0005596062 nova_compute[227313]: 2026-01-26 18:14:41.353 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:41.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:42.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:43.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:44.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:45 np0005596062 nova_compute[227313]: 2026-01-26 18:14:45.093 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:45.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:46 np0005596062 nova_compute[227313]: 2026-01-26 18:14:46.355 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:46 np0005596062 nova_compute[227313]: 2026-01-26 18:14:46.919 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:46 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:46.919 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:14:46 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:46.921 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:14:46 np0005596062 podman[240996]: 2026-01-26 18:14:46.934047517 +0000 UTC m=+0.133925125 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 13:14:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:47.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:47.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:47 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:14:47.924 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:14:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:49.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:49.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:50 np0005596062 nova_compute[227313]: 2026-01-26 18:14:50.097 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:51.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:51 np0005596062 nova_compute[227313]: 2026-01-26 18:14:51.395 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:51.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:53.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:53.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:53Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:f1:0a 10.100.0.13
Jan 26 13:14:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:14:53Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:f1:0a 10.100.0.13
Jan 26 13:14:55 np0005596062 nova_compute[227313]: 2026-01-26 18:14:55.099 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.232372) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451295233034, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2671, "num_deletes": 504, "total_data_size": 6227435, "memory_usage": 6297192, "flush_reason": "Manual Compaction"}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451295251063, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 2553748, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25714, "largest_seqno": 28379, "table_properties": {"data_size": 2545594, "index_size": 4200, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 23219, "raw_average_key_size": 20, "raw_value_size": 2525926, "raw_average_value_size": 2206, "num_data_blocks": 187, "num_entries": 1145, "num_filter_entries": 1145, "num_deletions": 504, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451085, "oldest_key_time": 1769451085, "file_creation_time": 1769451295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 18851 microseconds, and 6718 cpu microseconds.
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.251230) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 2553748 bytes OK
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.251310) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.253012) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.253031) EVENT_LOG_v1 {"time_micros": 1769451295253025, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.253053) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6214822, prev total WAL file size 6214822, number of live WAL files 2.
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.255018) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(2493KB)], [51(9341KB)]
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451295255065, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 12119092, "oldest_snapshot_seqno": -1}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5204 keys, 7748024 bytes, temperature: kUnknown
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451295307189, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 7748024, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7714642, "index_size": 19256, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13061, "raw_key_size": 131790, "raw_average_key_size": 25, "raw_value_size": 7622079, "raw_average_value_size": 1464, "num_data_blocks": 780, "num_entries": 5204, "num_filter_entries": 5204, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.307478) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 7748024 bytes
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.308660) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 232.2 rd, 148.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 9.1 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(7.8) write-amplify(3.0) OK, records in: 6140, records dropped: 936 output_compression: NoCompression
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.308697) EVENT_LOG_v1 {"time_micros": 1769451295308674, "job": 30, "event": "compaction_finished", "compaction_time_micros": 52203, "compaction_time_cpu_micros": 20119, "output_level": 6, "num_output_files": 1, "total_output_size": 7748024, "num_input_records": 6140, "num_output_records": 5204, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451295309496, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451295312294, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.254875) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.312396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.312402) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.312404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.312405) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:14:55 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:14:55.312407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:14:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:55.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:56 np0005596062 nova_compute[227313]: 2026-01-26 18:14:56.399 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:14:57 np0005596062 nova_compute[227313]: 2026-01-26 18:14:57.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:57 np0005596062 nova_compute[227313]: 2026-01-26 18:14:57.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:14:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:14:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:57.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:14:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:14:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:57.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:14:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.053 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.053 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.054 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:14:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:14:59.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.476 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.476 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.476 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.477 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.477 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:14:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:14:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:14:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:14:59.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:14:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:14:59 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2078137892' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:14:59 np0005596062 nova_compute[227313]: 2026-01-26 18:14:59.939 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:00 np0005596062 nova_compute[227313]: 2026-01-26 18:15:00.102 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:00 np0005596062 nova_compute[227313]: 2026-01-26 18:15:00.195 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.1960] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/54)
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.1974] device (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <warn>  [1769451300.1976] device (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.1990] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/55)
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.1996] device (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <warn>  [1769451300.1998] device (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.2009] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.2018] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.2025] device (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 13:15:00 np0005596062 NetworkManager[48993]: <info>  [1769451300.2031] device (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 13:15:00 np0005596062 nova_compute[227313]: 2026-01-26 18:15:00.426 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:00 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:00Z|00103|binding|INFO|Releasing lport e7ca5b9f-aa29-4279-8a64-e32d070b5763 from this chassis (sb_readonly=0)
Jan 26 13:15:00 np0005596062 nova_compute[227313]: 2026-01-26 18:15:00.454 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:01.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.402 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.627 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.627 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:15:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:01.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.814 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.816 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4629MB free_disk=20.94293212890625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.816 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:01 np0005596062 nova_compute[227313]: 2026-01-26 18:15:01.817 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:02 np0005596062 nova_compute[227313]: 2026-01-26 18:15:02.960 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 36552a60-fe1c-495f-bc2d-779bbd623626 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:15:02 np0005596062 nova_compute[227313]: 2026-01-26 18:15:02.961 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:15:02 np0005596062 nova_compute[227313]: 2026-01-26 18:15:02.961 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.004 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:03.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.184 227317 DEBUG nova.compute.manager [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-changed-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.185 227317 DEBUG nova.compute.manager [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Refreshing instance network info cache due to event network-changed-160d3cc7-cc03-492b-9c27-d7ed9d8654a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.186 227317 DEBUG oslo_concurrency.lockutils [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.186 227317 DEBUG oslo_concurrency.lockutils [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.187 227317 DEBUG nova.network.neutron [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Refreshing network info cache for port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:15:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:15:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1223137234' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.526 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.532 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:15:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:03.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:03 np0005596062 nova_compute[227313]: 2026-01-26 18:15:03.944 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:15:04 np0005596062 nova_compute[227313]: 2026-01-26 18:15:04.152 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:15:04 np0005596062 nova_compute[227313]: 2026-01-26 18:15:04.152 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:05 np0005596062 nova_compute[227313]: 2026-01-26 18:15:05.104 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:05.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:05.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:06 np0005596062 nova_compute[227313]: 2026-01-26 18:15:06.153 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:06 np0005596062 nova_compute[227313]: 2026-01-26 18:15:06.154 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:06 np0005596062 nova_compute[227313]: 2026-01-26 18:15:06.154 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:15:06 np0005596062 nova_compute[227313]: 2026-01-26 18:15:06.155 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:15:06 np0005596062 nova_compute[227313]: 2026-01-26 18:15:06.405 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:06 np0005596062 nova_compute[227313]: 2026-01-26 18:15:06.897 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:15:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:07.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:07.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:07 np0005596062 podman[241129]: 2026-01-26 18:15:07.843748643 +0000 UTC m=+0.050637714 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 13:15:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:07 np0005596062 nova_compute[227313]: 2026-01-26 18:15:07.949 227317 DEBUG nova.network.neutron [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updated VIF entry in instance network info cache for port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:15:07 np0005596062 nova_compute[227313]: 2026-01-26 18:15:07.950 227317 DEBUG nova.network.neutron [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updating instance_info_cache with network_info: [{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:15:07 np0005596062 nova_compute[227313]: 2026-01-26 18:15:07.975 227317 DEBUG oslo_concurrency.lockutils [req-ba59c13a-b105-425c-a5b2-720ec4b75e54 req-06b23da7-3f21-4aff-8f39-87923573848f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:15:07 np0005596062 nova_compute[227313]: 2026-01-26 18:15:07.976 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:15:07 np0005596062 nova_compute[227313]: 2026-01-26 18:15:07.976 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:15:07 np0005596062 nova_compute[227313]: 2026-01-26 18:15:07.976 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 36552a60-fe1c-495f-bc2d-779bbd623626 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:15:08 np0005596062 nova_compute[227313]: 2026-01-26 18:15:08.237 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:09.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:09.164 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:09.165 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:09.165 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:09Z|00104|binding|INFO|Releasing lport e7ca5b9f-aa29-4279-8a64-e32d070b5763 from this chassis (sb_readonly=0)
Jan 26 13:15:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:09.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.731 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updating instance_info_cache with network_info: [{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.744 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.767 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.768 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.769 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.769 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:09 np0005596062 nova_compute[227313]: 2026-01-26 18:15:09.769 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:10 np0005596062 nova_compute[227313]: 2026-01-26 18:15:10.107 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:10 np0005596062 nova_compute[227313]: 2026-01-26 18:15:10.557 227317 DEBUG nova.compute.manager [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-changed-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:10 np0005596062 nova_compute[227313]: 2026-01-26 18:15:10.558 227317 DEBUG nova.compute.manager [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Refreshing instance network info cache due to event network-changed-160d3cc7-cc03-492b-9c27-d7ed9d8654a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:15:10 np0005596062 nova_compute[227313]: 2026-01-26 18:15:10.559 227317 DEBUG oslo_concurrency.lockutils [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:15:10 np0005596062 nova_compute[227313]: 2026-01-26 18:15:10.559 227317 DEBUG oslo_concurrency.lockutils [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:15:10 np0005596062 nova_compute[227313]: 2026-01-26 18:15:10.559 227317 DEBUG nova.network.neutron [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Refreshing network info cache for port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:15:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:11.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:11 np0005596062 nova_compute[227313]: 2026-01-26 18:15:11.408 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:11.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:13.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:13.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:14 np0005596062 nova_compute[227313]: 2026-01-26 18:15:14.953 227317 DEBUG nova.network.neutron [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updated VIF entry in instance network info cache for port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:15:14 np0005596062 nova_compute[227313]: 2026-01-26 18:15:14.953 227317 DEBUG nova.network.neutron [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updating instance_info_cache with network_info: [{"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:15:15 np0005596062 nova_compute[227313]: 2026-01-26 18:15:15.110 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:15.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:15 np0005596062 nova_compute[227313]: 2026-01-26 18:15:15.227 227317 DEBUG oslo_concurrency.lockutils [req-de284df2-aec3-450b-b024-813334ecdb6c req-37151db2-ad85-419e-9ee7-089735216d9e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-36552a60-fe1c-495f-bc2d-779bbd623626" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:15:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:15.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:16 np0005596062 nova_compute[227313]: 2026-01-26 18:15:16.411 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.266 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.267 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.268 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.268 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.268 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.270 227317 INFO nova.compute.manager [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Terminating instance#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.271 227317 DEBUG nova.compute.manager [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:15:17 np0005596062 kernel: tap160d3cc7-cc (unregistering): left promiscuous mode
Jan 26 13:15:17 np0005596062 NetworkManager[48993]: <info>  [1769451317.3368] device (tap160d3cc7-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.347 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:17Z|00105|binding|INFO|Releasing lport 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 from this chassis (sb_readonly=0)
Jan 26 13:15:17 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:17Z|00106|binding|INFO|Setting lport 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 down in Southbound
Jan 26 13:15:17 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:17Z|00107|binding|INFO|Removing iface tap160d3cc7-cc ovn-installed in OVS
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.350 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.355 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:f1:0a 10.100.0.13'], port_security=['fa:16:3e:82:f1:0a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '36552a60-fe1c-495f-bc2d-779bbd623626', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26bc1062b076453599f5be48e7cb8915', 'neutron:revision_number': '4', 'neutron:security_group_ids': '452a4723-0213-4c72-977b-973f75cbe2f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d145b5c0-6021-423e-868e-7999d10938ca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=160d3cc7-cc03-492b-9c27-d7ed9d8654a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.357 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 160d3cc7-cc03-492b-9c27-d7ed9d8654a4 in datapath 35df8c2b-5669-4fac-ac88-6d982b49ef1b unbound from our chassis#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.359 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 35df8c2b-5669-4fac-ac88-6d982b49ef1b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.360 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f94e1a53-8f7f-4190-8eea-13fe366d5c80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.361 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b namespace which is not needed anymore#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.369 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 26 13:15:17 np0005596062 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 15.780s CPU time.
Jan 26 13:15:17 np0005596062 systemd-machined[195380]: Machine qemu-10-instance-0000000d terminated.
Jan 26 13:15:17 np0005596062 podman[241335]: 2026-01-26 18:15:17.493527365 +0000 UTC m=+0.131870390 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.497 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [NOTICE]   (240932) : haproxy version is 2.8.14-c23fe91
Jan 26 13:15:17 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [NOTICE]   (240932) : path to executable is /usr/sbin/haproxy
Jan 26 13:15:17 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [WARNING]  (240932) : Exiting Master process...
Jan 26 13:15:17 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [ALERT]    (240932) : Current worker (240937) exited with code 143 (Terminated)
Jan 26 13:15:17 np0005596062 neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b[240913]: [WARNING]  (240932) : All workers exited. Exiting... (0)
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.504 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 systemd[1]: libpod-20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6.scope: Deactivated successfully.
Jan 26 13:15:17 np0005596062 podman[241383]: 2026-01-26 18:15:17.511422709 +0000 UTC m=+0.052178205 container died 20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.512 227317 INFO nova.virt.libvirt.driver [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Instance destroyed successfully.#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.512 227317 DEBUG nova.objects.instance [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lazy-loading 'resources' on Instance uuid 36552a60-fe1c-495f-bc2d-779bbd623626 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.536 227317 DEBUG nova.virt.libvirt.vif [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:14:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-987586838',display_name='tempest-FloatingIPsAssociationTestJSON-server-987586838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-987586838',id=13,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:14:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='26bc1062b076453599f5be48e7cb8915',ramdisk_id='',reservation_id='r-1l1mn0fc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-235009531',owner_user_name='tempest-FloatingIPsAssociationTestJSON-235009531-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:14:37Z,user_data=None,user_id='756b3c236ab34471af9186439bd20de5',uuid=36552a60-fe1c-495f-bc2d-779bbd623626,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.538 227317 DEBUG nova.network.os_vif_util [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Converting VIF {"id": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "address": "fa:16:3e:82:f1:0a", "network": {"id": "35df8c2b-5669-4fac-ac88-6d982b49ef1b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-17689225-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "26bc1062b076453599f5be48e7cb8915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160d3cc7-cc", "ovs_interfaceid": "160d3cc7-cc03-492b-9c27-d7ed9d8654a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.539 227317 DEBUG nova.network.os_vif_util [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.539 227317 DEBUG os_vif [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.541 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.542 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap160d3cc7-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:17 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6-userdata-shm.mount: Deactivated successfully.
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.544 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.545 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 systemd[1]: var-lib-containers-storage-overlay-982b9eda88488de538de8127d298db751048592d0a8841b16e1f4c89266c52c9-merged.mount: Deactivated successfully.
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.552 227317 INFO os_vif [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:f1:0a,bridge_name='br-int',has_traffic_filtering=True,id=160d3cc7-cc03-492b-9c27-d7ed9d8654a4,network=Network(35df8c2b-5669-4fac-ac88-6d982b49ef1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160d3cc7-cc')#033[00m
Jan 26 13:15:17 np0005596062 podman[241383]: 2026-01-26 18:15:17.560551202 +0000 UTC m=+0.101306698 container cleanup 20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:15:17 np0005596062 systemd[1]: libpod-conmon-20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6.scope: Deactivated successfully.
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.631 227317 DEBUG nova.compute.manager [req-4113ae14-766d-4c75-b0e4-01e956179f8c req-84abb0a9-9b27-4278-93c4-779ef38a6979 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-vif-unplugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.632 227317 DEBUG oslo_concurrency.lockutils [req-4113ae14-766d-4c75-b0e4-01e956179f8c req-84abb0a9-9b27-4278-93c4-779ef38a6979 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.632 227317 DEBUG oslo_concurrency.lockutils [req-4113ae14-766d-4c75-b0e4-01e956179f8c req-84abb0a9-9b27-4278-93c4-779ef38a6979 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.632 227317 DEBUG oslo_concurrency.lockutils [req-4113ae14-766d-4c75-b0e4-01e956179f8c req-84abb0a9-9b27-4278-93c4-779ef38a6979 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.633 227317 DEBUG nova.compute.manager [req-4113ae14-766d-4c75-b0e4-01e956179f8c req-84abb0a9-9b27-4278-93c4-779ef38a6979 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] No waiting events found dispatching network-vif-unplugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.633 227317 DEBUG nova.compute.manager [req-4113ae14-766d-4c75-b0e4-01e956179f8c req-84abb0a9-9b27-4278-93c4-779ef38a6979 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-vif-unplugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:15:17 np0005596062 podman[241439]: 2026-01-26 18:15:17.634140065 +0000 UTC m=+0.044613555 container remove 20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.640 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1a50e0b3-791c-4164-8ec5-69cf860fe68f]: (4, ('Mon Jan 26 06:15:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b (20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6)\n20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6\nMon Jan 26 06:15:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b (20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6)\n20c8273a26a86ba3b36f820515bacb1c64cbb858c5cdd8dca70d82039b0ee0a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.643 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a56da01b-2cf9-43e0-b237-fa9b607458b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.644 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35df8c2b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.646 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 kernel: tap35df8c2b-50: left promiscuous mode
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.658 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.660 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.664 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c5ac5e-8b38-4232-99ee-9b859fe43952]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.681 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e0f723-d7f8-44d9-80bb-5c01c97e6d9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.683 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bed26ad8-bc87-41c9-a6f5-ac9a8801d36e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.701 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[14a7dc91-0b4f-4e1e-b0e7-70a1af0411ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503459, 'reachable_time': 37469, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241457, 'error': None, 'target': 'ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 systemd[1]: run-netns-ovnmeta\x2d35df8c2b\x2d5669\x2d4fac\x2dac88\x2d6d982b49ef1b.mount: Deactivated successfully.
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.707 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-35df8c2b-5669-4fac-ac88-6d982b49ef1b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:15:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:17.707 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[5816f546-986a-4182-9fcc-efdcf634cf68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:17.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:17 np0005596062 nova_compute[227313]: 2026-01-26 18:15:17.768 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.852296) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451317852361, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 489, "num_deletes": 258, "total_data_size": 581162, "memory_usage": 591256, "flush_reason": "Manual Compaction"}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451317856953, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 383027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28384, "largest_seqno": 28868, "table_properties": {"data_size": 380410, "index_size": 653, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6128, "raw_average_key_size": 17, "raw_value_size": 375063, "raw_average_value_size": 1080, "num_data_blocks": 29, "num_entries": 347, "num_filter_entries": 347, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451296, "oldest_key_time": 1769451296, "file_creation_time": 1769451317, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 5221 microseconds, and 2048 cpu microseconds.
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.856986) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 383027 bytes OK
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.857548) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.858597) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.858610) EVENT_LOG_v1 {"time_micros": 1769451317858607, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.858652) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 578189, prev total WAL file size 578189, number of live WAL files 2.
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.859098) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373536' seq:0, type:0; will stop at (end)
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(374KB)], [54(7566KB)]
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451317859231, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 8131051, "oldest_snapshot_seqno": -1}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5025 keys, 8041734 bytes, temperature: kUnknown
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451317915669, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 8041734, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8008544, "index_size": 19512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 129319, "raw_average_key_size": 25, "raw_value_size": 7918140, "raw_average_value_size": 1575, "num_data_blocks": 788, "num_entries": 5025, "num_filter_entries": 5025, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451317, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.916041) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 8041734 bytes
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.917220) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 144.0 rd, 142.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 7.4 +0.0 blob) out(7.7 +0.0 blob), read-write-amplify(42.2) write-amplify(21.0) OK, records in: 5551, records dropped: 526 output_compression: NoCompression
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.917238) EVENT_LOG_v1 {"time_micros": 1769451317917228, "job": 32, "event": "compaction_finished", "compaction_time_micros": 56457, "compaction_time_cpu_micros": 23921, "output_level": 6, "num_output_files": 1, "total_output_size": 8041734, "num_input_records": 5551, "num_output_records": 5025, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451317917423, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451317918666, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.859011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.918834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.918843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.918845) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.918846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:15:17 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:15:17.918848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.076 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.405 227317 INFO nova.virt.libvirt.driver [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Deleting instance files /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626_del#033[00m
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.406 227317 INFO nova.virt.libvirt.driver [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Deletion of /var/lib/nova/instances/36552a60-fe1c-495f-bc2d-779bbd623626_del complete#033[00m
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.465 227317 INFO nova.compute.manager [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Took 1.19 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.465 227317 DEBUG oslo.service.loopingcall [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.466 227317 DEBUG nova.compute.manager [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:15:18 np0005596062 nova_compute[227313]: 2026-01-26 18:15:18.466 227317 DEBUG nova.network.neutron [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:15:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:19.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.395 227317 DEBUG nova.network.neutron [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.425 227317 INFO nova.compute.manager [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Took 0.96 seconds to deallocate network for instance.#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.478 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.479 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.535 227317 DEBUG nova.compute.manager [req-8c8fd230-dcc3-48d3-836b-deffc08f5f69 req-5ec51c93-7b32-49c6-aebf-6bc41d448270 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-vif-deleted-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.553 227317 DEBUG oslo_concurrency.processutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:19.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.813 227317 DEBUG nova.compute.manager [req-1c4f35f0-381f-427a-a221-564bfffd7241 req-cad6ae6e-7346-40e5-947f-99a2177d4d46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.814 227317 DEBUG oslo_concurrency.lockutils [req-1c4f35f0-381f-427a-a221-564bfffd7241 req-cad6ae6e-7346-40e5-947f-99a2177d4d46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.814 227317 DEBUG oslo_concurrency.lockutils [req-1c4f35f0-381f-427a-a221-564bfffd7241 req-cad6ae6e-7346-40e5-947f-99a2177d4d46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.815 227317 DEBUG oslo_concurrency.lockutils [req-1c4f35f0-381f-427a-a221-564bfffd7241 req-cad6ae6e-7346-40e5-947f-99a2177d4d46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.815 227317 DEBUG nova.compute.manager [req-1c4f35f0-381f-427a-a221-564bfffd7241 req-cad6ae6e-7346-40e5-947f-99a2177d4d46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] No waiting events found dispatching network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:15:19 np0005596062 nova_compute[227313]: 2026-01-26 18:15:19.815 227317 WARNING nova.compute.manager [req-1c4f35f0-381f-427a-a221-564bfffd7241 req-cad6ae6e-7346-40e5-947f-99a2177d4d46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Received unexpected event network-vif-plugged-160d3cc7-cc03-492b-9c27-d7ed9d8654a4 for instance with vm_state deleted and task_state None.#033[00m
Jan 26 13:15:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:15:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1084996109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.038 227317 DEBUG oslo_concurrency.processutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.045 227317 DEBUG nova.compute.provider_tree [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.063 227317 DEBUG nova.scheduler.client.report [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.112 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.117 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.166 227317 INFO nova.scheduler.client.report [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Deleted allocations for instance 36552a60-fe1c-495f-bc2d-779bbd623626#033[00m
Jan 26 13:15:20 np0005596062 nova_compute[227313]: 2026-01-26 18:15:20.324 227317 DEBUG oslo_concurrency.lockutils [None req-86640507-699c-4d32-90d3-3c76a5d291cc 756b3c236ab34471af9186439bd20de5 26bc1062b076453599f5be48e7cb8915 - - default default] Lock "36552a60-fe1c-495f-bc2d-779bbd623626" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:21.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:21.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:22 np0005596062 nova_compute[227313]: 2026-01-26 18:15:22.546 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:23.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:23.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:24.843 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:15:24 np0005596062 nova_compute[227313]: 2026-01-26 18:15:24.844 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:24.845 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:15:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:24.846 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:25 np0005596062 nova_compute[227313]: 2026-01-26 18:15:25.114 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:25.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:25 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:15:25 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:15:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:25.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 26 13:15:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:27.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:27 np0005596062 nova_compute[227313]: 2026-01-26 18:15:27.548 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:27.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:29.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:29.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:30 np0005596062 nova_compute[227313]: 2026-01-26 18:15:30.116 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 26 13:15:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:31.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:31.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:32 np0005596062 nova_compute[227313]: 2026-01-26 18:15:32.511 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451317.5094833, 36552a60-fe1c-495f-bc2d-779bbd623626 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:15:32 np0005596062 nova_compute[227313]: 2026-01-26 18:15:32.512 227317 INFO nova.compute.manager [-] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:15:32 np0005596062 nova_compute[227313]: 2026-01-26 18:15:32.541 227317 DEBUG nova.compute.manager [None req-3835ee9c-3488-4927-8eeb-b470a8b02d29 - - - - - -] [instance: 36552a60-fe1c-495f-bc2d-779bbd623626] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:32 np0005596062 nova_compute[227313]: 2026-01-26 18:15:32.552 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:33.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:33.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:35 np0005596062 nova_compute[227313]: 2026-01-26 18:15:35.118 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:35.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:35.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:37.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:37 np0005596062 nova_compute[227313]: 2026-01-26 18:15:37.555 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:37.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 26 13:15:38 np0005596062 podman[241592]: 2026-01-26 18:15:38.857115197 +0000 UTC m=+0.061959515 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 13:15:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:39.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:39.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:40 np0005596062 nova_compute[227313]: 2026-01-26 18:15:40.145 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:41.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:41.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:42 np0005596062 nova_compute[227313]: 2026-01-26 18:15:42.590 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:15:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:43.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:15:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:43.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:44 np0005596062 nova_compute[227313]: 2026-01-26 18:15:44.841 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:44 np0005596062 nova_compute[227313]: 2026-01-26 18:15:44.841 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:44 np0005596062 nova_compute[227313]: 2026-01-26 18:15:44.908 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.003 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.004 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.015 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.015 227317 INFO nova.compute.claims [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.126 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.154 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:45.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:45 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:15:45 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1916504275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.622 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.629 227317 DEBUG nova.compute.provider_tree [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:15:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:45.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.853 227317 DEBUG nova.scheduler.client.report [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.915 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:45 np0005596062 nova_compute[227313]: 2026-01-26 18:15:45.916 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.013 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.014 227317 DEBUG nova.network.neutron [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.119 227317 INFO nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.157 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.640 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.642 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.642 227317 INFO nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Creating image(s)#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.669 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.701 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.728 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.732 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.766 227317 DEBUG nova.policy [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '995bf2bf54f64b3490e7b8e751aedb77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b5b72f4b8394daba5420fe9fc17a7bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.805 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.806 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.807 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.807 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.832 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:46 np0005596062 nova_compute[227313]: 2026-01-26 18:15:46.838 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 0ba43746-10bc-41bd-aa56-246af8723901_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:47.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:47 np0005596062 nova_compute[227313]: 2026-01-26 18:15:47.360 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 0ba43746-10bc-41bd-aa56-246af8723901_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:47 np0005596062 nova_compute[227313]: 2026-01-26 18:15:47.445 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] resizing rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:15:47 np0005596062 nova_compute[227313]: 2026-01-26 18:15:47.593 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:47.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:47 np0005596062 podman[241789]: 2026-01-26 18:15:47.932382821 +0000 UTC m=+0.137017096 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:15:47 np0005596062 nova_compute[227313]: 2026-01-26 18:15:47.978 227317 DEBUG nova.objects.instance [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lazy-loading 'migration_context' on Instance uuid 0ba43746-10bc-41bd-aa56-246af8723901 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:15:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:48 np0005596062 nova_compute[227313]: 2026-01-26 18:15:48.177 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:15:48 np0005596062 nova_compute[227313]: 2026-01-26 18:15:48.178 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Ensure instance console log exists: /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:15:48 np0005596062 nova_compute[227313]: 2026-01-26 18:15:48.178 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:48 np0005596062 nova_compute[227313]: 2026-01-26 18:15:48.178 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:48 np0005596062 nova_compute[227313]: 2026-01-26 18:15:48.179 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:48 np0005596062 nova_compute[227313]: 2026-01-26 18:15:48.562 227317 DEBUG nova.network.neutron [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Successfully created port: 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:15:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:49.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:49.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:50 np0005596062 nova_compute[227313]: 2026-01-26 18:15:50.149 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.205 227317 DEBUG nova.network.neutron [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Successfully updated port: 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:15:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:51.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.555 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.555 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquired lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.556 227317 DEBUG nova.network.neutron [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.778 227317 DEBUG nova.compute.manager [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-changed-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.779 227317 DEBUG nova.compute.manager [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Refreshing instance network info cache due to event network-changed-082b9c3e-3355-4d4d-ac73-b0b4fd981f48. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:15:51 np0005596062 nova_compute[227313]: 2026-01-26 18:15:51.780 227317 DEBUG oslo_concurrency.lockutils [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:15:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:51.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:52 np0005596062 nova_compute[227313]: 2026-01-26 18:15:52.070 227317 DEBUG nova.network.neutron [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:15:52 np0005596062 nova_compute[227313]: 2026-01-26 18:15:52.597 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.019 227317 DEBUG nova.network.neutron [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updating instance_info_cache with network_info: [{"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:15:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.137 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Releasing lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.138 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Instance network_info: |[{"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.139 227317 DEBUG oslo_concurrency.lockutils [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.140 227317 DEBUG nova.network.neutron [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Refreshing network info cache for port 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.144 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Start _get_guest_xml network_info=[{"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.152 227317 WARNING nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.158 227317 DEBUG nova.virt.libvirt.host [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.159 227317 DEBUG nova.virt.libvirt.host [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.164 227317 DEBUG nova.virt.libvirt.host [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.165 227317 DEBUG nova.virt.libvirt.host [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.167 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.168 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.169 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.169 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.170 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.170 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.171 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.171 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.172 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.172 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.173 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.173 227317 DEBUG nova.virt.hardware [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.178 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:53.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:15:53 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2888838345' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.706 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.737 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:53 np0005596062 nova_compute[227313]: 2026-01-26 18:15:53.741 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:53.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:54 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:15:54 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/259438054' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.194 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.196 227317 DEBUG nova.virt.libvirt.vif [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:15:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-431958299',display_name='tempest-ImagesTestJSON-server-431958299',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-431958299',id=14,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b5b72f4b8394daba5420fe9fc17a7bb',ramdisk_id='',reservation_id='r-al23wrd7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-596605936',owner_user_name='tempest-ImagesTestJSON-596605936-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:15:46Z,user_data=None,user_id='995bf2bf54f64b3490e7b8e751aedb77',uuid=0ba43746-10bc-41bd-aa56-246af8723901,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.196 227317 DEBUG nova.network.os_vif_util [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Converting VIF {"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.197 227317 DEBUG nova.network.os_vif_util [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.198 227317 DEBUG nova.objects.instance [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lazy-loading 'pci_devices' on Instance uuid 0ba43746-10bc-41bd-aa56-246af8723901 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.233 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <uuid>0ba43746-10bc-41bd-aa56-246af8723901</uuid>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <name>instance-0000000e</name>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:name>tempest-ImagesTestJSON-server-431958299</nova:name>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:15:53</nova:creationTime>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:user uuid="995bf2bf54f64b3490e7b8e751aedb77">tempest-ImagesTestJSON-596605936-project-member</nova:user>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:project uuid="0b5b72f4b8394daba5420fe9fc17a7bb">tempest-ImagesTestJSON-596605936</nova:project>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <nova:port uuid="082b9c3e-3355-4d4d-ac73-b0b4fd981f48">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <entry name="serial">0ba43746-10bc-41bd-aa56-246af8723901</entry>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <entry name="uuid">0ba43746-10bc-41bd-aa56-246af8723901</entry>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/0ba43746-10bc-41bd-aa56-246af8723901_disk">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/0ba43746-10bc-41bd-aa56-246af8723901_disk.config">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:a9:2e:64"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <target dev="tap082b9c3e-33"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/console.log" append="off"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:15:54 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:15:54 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:15:54 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:15:54 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.235 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Preparing to wait for external event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.236 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.236 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.236 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.237 227317 DEBUG nova.virt.libvirt.vif [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:15:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-431958299',display_name='tempest-ImagesTestJSON-server-431958299',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-431958299',id=14,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b5b72f4b8394daba5420fe9fc17a7bb',ramdisk_id='',reservation_id='r-al23wrd7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-596605936',owner_user_name='tempest-ImagesTestJSON-596605936-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:15:46Z,user_data=None,user_id='995bf2bf54f64b3490e7b8e751aedb77',uuid=0ba43746-10bc-41bd-aa56-246af8723901,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.237 227317 DEBUG nova.network.os_vif_util [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Converting VIF {"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.238 227317 DEBUG nova.network.os_vif_util [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.238 227317 DEBUG os_vif [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.239 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.239 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.240 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.244 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.244 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap082b9c3e-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.245 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap082b9c3e-33, col_values=(('external_ids', {'iface-id': '082b9c3e-3355-4d4d-ac73-b0b4fd981f48', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:2e:64', 'vm-uuid': '0ba43746-10bc-41bd-aa56-246af8723901'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.247 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:54 np0005596062 NetworkManager[48993]: <info>  [1769451354.2481] manager: (tap082b9c3e-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.249 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.252 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.253 227317 INFO os_vif [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33')#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.629 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.630 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.630 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] No VIF found with MAC fa:16:3e:a9:2e:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.630 227317 INFO nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Using config drive#033[00m
Jan 26 13:15:54 np0005596062 nova_compute[227313]: 2026-01-26 18:15:54.657 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:55 np0005596062 nova_compute[227313]: 2026-01-26 18:15:55.187 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:15:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:55.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:15:55 np0005596062 nova_compute[227313]: 2026-01-26 18:15:55.719 227317 INFO nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Creating config drive at /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/disk.config#033[00m
Jan 26 13:15:55 np0005596062 nova_compute[227313]: 2026-01-26 18:15:55.726 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp83t4d8c7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:55.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:55 np0005596062 nova_compute[227313]: 2026-01-26 18:15:55.862 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp83t4d8c7" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:55 np0005596062 nova_compute[227313]: 2026-01-26 18:15:55.895 227317 DEBUG nova.storage.rbd_utils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] rbd image 0ba43746-10bc-41bd-aa56-246af8723901_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:15:55 np0005596062 nova_compute[227313]: 2026-01-26 18:15:55.900 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/disk.config 0ba43746-10bc-41bd-aa56-246af8723901_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.314 227317 DEBUG oslo_concurrency.processutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/disk.config 0ba43746-10bc-41bd-aa56-246af8723901_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.315 227317 INFO nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Deleting local config drive /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901/disk.config because it was imported into RBD.#033[00m
Jan 26 13:15:56 np0005596062 kernel: tap082b9c3e-33: entered promiscuous mode
Jan 26 13:15:56 np0005596062 NetworkManager[48993]: <info>  [1769451356.3815] manager: (tap082b9c3e-33): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Jan 26 13:15:56 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:56Z|00108|binding|INFO|Claiming lport 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 for this chassis.
Jan 26 13:15:56 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:56Z|00109|binding|INFO|082b9c3e-3355-4d4d-ac73-b0b4fd981f48: Claiming fa:16:3e:a9:2e:64 10.100.0.6
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.382 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.388 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.406 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:2e:64 10.100.0.6'], port_security=['fa:16:3e:a9:2e:64 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0ba43746-10bc-41bd-aa56-246af8723901', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b5b72f4b8394daba5420fe9fc17a7bb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aecc7901-6616-49a6-9a43-16b27d3342ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e18fe23-3bd8-404b-8c53-ea3125fa18eb, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=082b9c3e-3355-4d4d-ac73-b0b4fd981f48) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.409 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 in datapath ef015cda-1a8b-490a-a7b2-b92e5cef4798 bound to our chassis#033[00m
Jan 26 13:15:56 np0005596062 systemd-udevd[242023]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.413 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef015cda-1a8b-490a-a7b2-b92e5cef4798#033[00m
Jan 26 13:15:56 np0005596062 NetworkManager[48993]: <info>  [1769451356.4246] device (tap082b9c3e-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:15:56 np0005596062 NetworkManager[48993]: <info>  [1769451356.4255] device (tap082b9c3e-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:15:56 np0005596062 systemd-machined[195380]: New machine qemu-11-instance-0000000e.
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.430 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[02e6ea97-2e05-4118-8411-def764f35572]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.432 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef015cda-11 in ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.435 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef015cda-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.436 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[71f2262a-99cc-4157-acb2-397108d52f88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.437 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[de419f54-c22f-43f6-88e6-afeb8fc777e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.445 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.453 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.453 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[956e547a-471d-4ed9-a459-c8eaa6fbf01f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 systemd[1]: Started Virtual Machine qemu-11-instance-0000000e.
Jan 26 13:15:56 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:56Z|00110|binding|INFO|Setting lport 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 ovn-installed in OVS
Jan 26 13:15:56 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:56Z|00111|binding|INFO|Setting lport 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 up in Southbound
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.458 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.472 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b7f254-be1d-436f-96c8-bfd4ebcafddf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.505 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ff77de-bba5-4ddd-b4e1-46dc06cfc278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 NetworkManager[48993]: <info>  [1769451356.5143] manager: (tapef015cda-10): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.514 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c354cd76-5d8e-47b3-a46a-843d13debaac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.557 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2e4523-96cb-4c2f-a456-2590d0fe5b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.561 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[23999d53-0f6a-4439-803a-c693c2d64740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 NetworkManager[48993]: <info>  [1769451356.5871] device (tapef015cda-10): carrier: link connected
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.591 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[08f13957-d62b-436e-823b-fdc89955dd40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.613 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[386adc59-d7f3-46c2-8428-3acd087a0b8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef015cda-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:74:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511663, 'reachable_time': 42420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242058, 'error': None, 'target': 'ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.628 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6261aafc-de0b-4a81-aeab-332101e7ed07]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:7444'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511663, 'tstamp': 511663}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242059, 'error': None, 'target': 'ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.652 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[db0c4980-934b-468e-99e3-66feae550c44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef015cda-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:74:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511663, 'reachable_time': 42420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242060, 'error': None, 'target': 'ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.689 227317 DEBUG nova.network.neutron [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updated VIF entry in instance network info cache for port 082b9c3e-3355-4d4d-ac73-b0b4fd981f48. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.689 227317 DEBUG nova.network.neutron [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updating instance_info_cache with network_info: [{"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.693 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5bcdcc69-38a5-4256-a6d2-dade54af5a2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.719 227317 DEBUG oslo_concurrency.lockutils [req-542408c8-2944-453a-a1a9-9a483abf3fff req-9867a23b-d765-41de-b3d7-26e90b10795a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.768 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0c88e2a0-cc4e-4a91-9ca6-881aa3271606]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.771 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef015cda-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.771 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.772 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef015cda-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.774 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:56 np0005596062 kernel: tapef015cda-10: entered promiscuous mode
Jan 26 13:15:56 np0005596062 NetworkManager[48993]: <info>  [1769451356.7755] manager: (tapef015cda-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.778 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef015cda-10, col_values=(('external_ids', {'iface-id': 'bcd6967a-bd14-427a-82a0-470ca8066f39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:15:56 np0005596062 ovn_controller[133984]: 2026-01-26T18:15:56Z|00112|binding|INFO|Releasing lport bcd6967a-bd14-427a-82a0-470ca8066f39 from this chassis (sb_readonly=0)
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.783 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef015cda-1a8b-490a-a7b2-b92e5cef4798.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef015cda-1a8b-490a-a7b2-b92e5cef4798.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.784 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a613e16e-fb57-4165-b96f-e699125f3e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.786 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-ef015cda-1a8b-490a-a7b2-b92e5cef4798
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/ef015cda-1a8b-490a-a7b2-b92e5cef4798.pid.haproxy
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID ef015cda-1a8b-490a-a7b2-b92e5cef4798
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:15:56 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:15:56.787 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'env', 'PROCESS_TAG=haproxy-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef015cda-1a8b-490a-a7b2-b92e5cef4798.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:15:56 np0005596062 nova_compute[227313]: 2026-01-26 18:15:56.797 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.008 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451357.0081089, 0ba43746-10bc-41bd-aa56-246af8723901 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.009 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] VM Started (Lifecycle Event)#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.035 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.040 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451357.008294, 0ba43746-10bc-41bd-aa56-246af8723901 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.040 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.099 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.106 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.152 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.189 227317 DEBUG nova.compute.manager [req-efec2cca-e8ad-4493-93ef-59bd5352b51f req-0b6c3904-6a6c-4bcd-9c2a-fed37e868213 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.189 227317 DEBUG oslo_concurrency.lockutils [req-efec2cca-e8ad-4493-93ef-59bd5352b51f req-0b6c3904-6a6c-4bcd-9c2a-fed37e868213 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.190 227317 DEBUG oslo_concurrency.lockutils [req-efec2cca-e8ad-4493-93ef-59bd5352b51f req-0b6c3904-6a6c-4bcd-9c2a-fed37e868213 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.190 227317 DEBUG oslo_concurrency.lockutils [req-efec2cca-e8ad-4493-93ef-59bd5352b51f req-0b6c3904-6a6c-4bcd-9c2a-fed37e868213 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.190 227317 DEBUG nova.compute.manager [req-efec2cca-e8ad-4493-93ef-59bd5352b51f req-0b6c3904-6a6c-4bcd-9c2a-fed37e868213 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Processing event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.191 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:15:57 np0005596062 podman[242134]: 2026-01-26 18:15:57.191674707 +0000 UTC m=+0.059758087 container create 41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.196 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.197 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451357.19602, 0ba43746-10bc-41bd-aa56-246af8723901 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.197 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.203 227317 INFO nova.virt.libvirt.driver [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Instance spawned successfully.#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.203 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:15:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:57.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:57 np0005596062 systemd[1]: Started libpod-conmon-41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb.scope.
Jan 26 13:15:57 np0005596062 podman[242134]: 2026-01-26 18:15:57.160457488 +0000 UTC m=+0.028540888 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.259 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.264 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.265 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.265 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.266 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.266 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.266 227317 DEBUG nova.virt.libvirt.driver [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.271 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:15:57 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:15:57 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/988d60742055f7903448358d0000d7bcd69e52df2eb4c6294fd495f913f30c12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:15:57 np0005596062 podman[242134]: 2026-01-26 18:15:57.294753832 +0000 UTC m=+0.162837232 container init 41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:15:57 np0005596062 podman[242134]: 2026-01-26 18:15:57.301068389 +0000 UTC m=+0.169151769 container start 41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:15:57 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [NOTICE]   (242153) : New worker (242155) forked
Jan 26 13:15:57 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [NOTICE]   (242153) : Loading success.
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.362 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.406 227317 INFO nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Took 10.76 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.406 227317 DEBUG nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.514 227317 INFO nova.compute.manager [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Took 12.54 seconds to build instance.#033[00m
Jan 26 13:15:57 np0005596062 nova_compute[227313]: 2026-01-26 18:15:57.544 227317 DEBUG oslo_concurrency.lockutils [None req-cb4d7827-5344-4d50-a247-cdea6307b088 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:15:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:57.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:15:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.053 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.054 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.091 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.092 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.092 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.092 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.093 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:15:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:15:59.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.247 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.584 227317 DEBUG nova.compute.manager [req-6cf70fbc-cebd-45ae-a429-c13fbc021472 req-4435dc2c-d09f-48de-8d1b-5dfec04492d9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.585 227317 DEBUG oslo_concurrency.lockutils [req-6cf70fbc-cebd-45ae-a429-c13fbc021472 req-4435dc2c-d09f-48de-8d1b-5dfec04492d9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.585 227317 DEBUG oslo_concurrency.lockutils [req-6cf70fbc-cebd-45ae-a429-c13fbc021472 req-4435dc2c-d09f-48de-8d1b-5dfec04492d9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.586 227317 DEBUG oslo_concurrency.lockutils [req-6cf70fbc-cebd-45ae-a429-c13fbc021472 req-4435dc2c-d09f-48de-8d1b-5dfec04492d9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.586 227317 DEBUG nova.compute.manager [req-6cf70fbc-cebd-45ae-a429-c13fbc021472 req-4435dc2c-d09f-48de-8d1b-5dfec04492d9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] No waiting events found dispatching network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.586 227317 WARNING nova.compute.manager [req-6cf70fbc-cebd-45ae-a429-c13fbc021472 req-4435dc2c-d09f-48de-8d1b-5dfec04492d9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received unexpected event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 for instance with vm_state active and task_state pausing.#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.616 227317 INFO nova.compute.manager [None req-911a3df5-5142-4f58-a600-ac552b40da03 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Pausing#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.617 227317 DEBUG nova.objects.instance [None req-911a3df5-5142-4f58-a600-ac552b40da03 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lazy-loading 'flavor' on Instance uuid 0ba43746-10bc-41bd-aa56-246af8723901 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.667 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451359.6667151, 0ba43746-10bc-41bd-aa56-246af8723901 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.668 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.670 227317 DEBUG nova.compute.manager [None req-911a3df5-5142-4f58-a600-ac552b40da03 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.740 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.744 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:15:59 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:15:59 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1200367955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:15:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:15:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:15:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:15:59.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.812 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.719s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:15:59 np0005596062 nova_compute[227313]: 2026-01-26 18:15:59.905 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.023 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.023 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.189 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.244 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.245 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4658MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.245 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.246 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.434 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 0ba43746-10bc-41bd-aa56-246af8723901 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.435 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.435 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:16:00 np0005596062 nova_compute[227313]: 2026-01-26 18:16:00.516 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:16:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:16:00 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1376469013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:16:01 np0005596062 nova_compute[227313]: 2026-01-26 18:16:01.004 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:16:01 np0005596062 nova_compute[227313]: 2026-01-26 18:16:01.011 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:16:01 np0005596062 nova_compute[227313]: 2026-01-26 18:16:01.073 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:16:01 np0005596062 nova_compute[227313]: 2026-01-26 18:16:01.171 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:16:01 np0005596062 nova_compute[227313]: 2026-01-26 18:16:01.172 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:01.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:01.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:02 np0005596062 nova_compute[227313]: 2026-01-26 18:16:02.171 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:02 np0005596062 nova_compute[227313]: 2026-01-26 18:16:02.173 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:02 np0005596062 nova_compute[227313]: 2026-01-26 18:16:02.173 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:02 np0005596062 nova_compute[227313]: 2026-01-26 18:16:02.173 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:16:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:03.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:03.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.922 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.923 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.923 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:16:03 np0005596062 nova_compute[227313]: 2026-01-26 18:16:03.923 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0ba43746-10bc-41bd-aa56-246af8723901 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:16:04 np0005596062 nova_compute[227313]: 2026-01-26 18:16:04.250 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:05 np0005596062 nova_compute[227313]: 2026-01-26 18:16:05.191 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:05.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:05 np0005596062 nova_compute[227313]: 2026-01-26 18:16:05.585 227317 DEBUG nova.compute.manager [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:16:05 np0005596062 nova_compute[227313]: 2026-01-26 18:16:05.656 227317 INFO nova.compute.manager [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] instance snapshotting#033[00m
Jan 26 13:16:05 np0005596062 nova_compute[227313]: 2026-01-26 18:16:05.656 227317 WARNING nova.compute.manager [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Jan 26 13:16:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:05.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:06 np0005596062 nova_compute[227313]: 2026-01-26 18:16:06.483 227317 INFO nova.virt.libvirt.driver [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Beginning live snapshot process#033[00m
Jan 26 13:16:06 np0005596062 nova_compute[227313]: 2026-01-26 18:16:06.708 227317 DEBUG nova.virt.libvirt.imagebackend [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] No parent info for 57de5960-c1c5-4cfa-af34-8f58cf25f585; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 26 13:16:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:07.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:07 np0005596062 nova_compute[227313]: 2026-01-26 18:16:07.240 227317 DEBUG nova.storage.rbd_utils [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] creating snapshot(661ff4c8c0ca4fe99b4625173857c099) on rbd image(0ba43746-10bc-41bd-aa56-246af8723901_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:16:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 26 13:16:07 np0005596062 nova_compute[227313]: 2026-01-26 18:16:07.550 227317 DEBUG nova.storage.rbd_utils [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] cloning vms/0ba43746-10bc-41bd-aa56-246af8723901_disk@661ff4c8c0ca4fe99b4625173857c099 to images/b90744db-7278-4916-a68c-fdabcb354c5e clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 26 13:16:07 np0005596062 nova_compute[227313]: 2026-01-26 18:16:07.713 227317 DEBUG nova.storage.rbd_utils [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] flattening images/b90744db-7278-4916-a68c-fdabcb354c5e flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 26 13:16:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:07.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:08 np0005596062 nova_compute[227313]: 2026-01-26 18:16:08.281 227317 DEBUG nova.storage.rbd_utils [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] removing snapshot(661ff4c8c0ca4fe99b4625173857c099) on rbd image(0ba43746-10bc-41bd-aa56-246af8723901_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 26 13:16:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 26 13:16:08 np0005596062 nova_compute[227313]: 2026-01-26 18:16:08.534 227317 DEBUG nova.storage.rbd_utils [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] creating snapshot(snap) on rbd image(b90744db-7278-4916-a68c-fdabcb354c5e) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:16:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:09.165 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:09.166 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:09.167 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:09.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:09 np0005596062 nova_compute[227313]: 2026-01-26 18:16:09.299 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 26 13:16:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:09.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:09 np0005596062 podman[242358]: 2026-01-26 18:16:09.867643066 +0000 UTC m=+0.070906653 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 13:16:10 np0005596062 nova_compute[227313]: 2026-01-26 18:16:10.194 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:11.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:11.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:12 np0005596062 nova_compute[227313]: 2026-01-26 18:16:12.327 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updating instance_info_cache with network_info: [{"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:16:12 np0005596062 nova_compute[227313]: 2026-01-26 18:16:12.469 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-0ba43746-10bc-41bd-aa56-246af8723901" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:16:12 np0005596062 nova_compute[227313]: 2026-01-26 18:16:12.470 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:16:12 np0005596062 nova_compute[227313]: 2026-01-26 18:16:12.470 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:12 np0005596062 nova_compute[227313]: 2026-01-26 18:16:12.470 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:13.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:13 np0005596062 nova_compute[227313]: 2026-01-26 18:16:13.453 227317 INFO nova.virt.libvirt.driver [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Snapshot image upload complete#033[00m
Jan 26 13:16:13 np0005596062 nova_compute[227313]: 2026-01-26 18:16:13.454 227317 INFO nova.compute.manager [None req-97401647-f1b6-4cd4-b3ef-f1f56b83881d 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Took 7.80 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 26 13:16:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:13.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:14.198 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:16:14 np0005596062 nova_compute[227313]: 2026-01-26 18:16:14.198 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:14.199 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:16:14 np0005596062 nova_compute[227313]: 2026-01-26 18:16:14.301 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:15 np0005596062 nova_compute[227313]: 2026-01-26 18:16:15.197 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:15.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:15 np0005596062 nova_compute[227313]: 2026-01-26 18:16:15.464 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:15.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 26 13:16:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:17.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 26 13:16:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:17.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:18 np0005596062 podman[242432]: 2026-01-26 18:16:18.894099434 +0000 UTC m=+0.097746805 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:16:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:19.202 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:16:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:19.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.303 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.704 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.705 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.705 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.705 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.706 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.707 227317 INFO nova.compute.manager [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Terminating instance#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.708 227317 DEBUG nova.compute.manager [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:16:19 np0005596062 kernel: tap082b9c3e-33 (unregistering): left promiscuous mode
Jan 26 13:16:19 np0005596062 NetworkManager[48993]: <info>  [1769451379.7611] device (tap082b9c3e-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:16:19 np0005596062 ovn_controller[133984]: 2026-01-26T18:16:19Z|00113|binding|INFO|Releasing lport 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 from this chassis (sb_readonly=0)
Jan 26 13:16:19 np0005596062 ovn_controller[133984]: 2026-01-26T18:16:19Z|00114|binding|INFO|Setting lport 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 down in Southbound
Jan 26 13:16:19 np0005596062 ovn_controller[133984]: 2026-01-26T18:16:19Z|00115|binding|INFO|Removing iface tap082b9c3e-33 ovn-installed in OVS
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.773 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:19.784 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:2e:64 10.100.0.6'], port_security=['fa:16:3e:a9:2e:64 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0ba43746-10bc-41bd-aa56-246af8723901', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b5b72f4b8394daba5420fe9fc17a7bb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aecc7901-6616-49a6-9a43-16b27d3342ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e18fe23-3bd8-404b-8c53-ea3125fa18eb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=082b9c3e-3355-4d4d-ac73-b0b4fd981f48) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:16:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:19.785 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 082b9c3e-3355-4d4d-ac73-b0b4fd981f48 in datapath ef015cda-1a8b-490a-a7b2-b92e5cef4798 unbound from our chassis#033[00m
Jan 26 13:16:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:19.787 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef015cda-1a8b-490a-a7b2-b92e5cef4798, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:16:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:19.788 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d13c9da1-40ec-4540-9638-8e1de3496734]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:19.789 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798 namespace which is not needed anymore#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.796 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:19 np0005596062 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 26 13:16:19 np0005596062 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Consumed 3.091s CPU time.
Jan 26 13:16:19 np0005596062 systemd-machined[195380]: Machine qemu-11-instance-0000000e terminated.
Jan 26 13:16:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:19.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.935 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.941 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.951 227317 INFO nova.virt.libvirt.driver [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Instance destroyed successfully.#033[00m
Jan 26 13:16:19 np0005596062 nova_compute[227313]: 2026-01-26 18:16:19.952 227317 DEBUG nova.objects.instance [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lazy-loading 'resources' on Instance uuid 0ba43746-10bc-41bd-aa56-246af8723901 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:16:20 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [NOTICE]   (242153) : haproxy version is 2.8.14-c23fe91
Jan 26 13:16:20 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [NOTICE]   (242153) : path to executable is /usr/sbin/haproxy
Jan 26 13:16:20 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [WARNING]  (242153) : Exiting Master process...
Jan 26 13:16:20 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [ALERT]    (242153) : Current worker (242155) exited with code 143 (Terminated)
Jan 26 13:16:20 np0005596062 neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798[242149]: [WARNING]  (242153) : All workers exited. Exiting... (0)
Jan 26 13:16:20 np0005596062 systemd[1]: libpod-41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb.scope: Deactivated successfully.
Jan 26 13:16:20 np0005596062 podman[242481]: 2026-01-26 18:16:20.02311317 +0000 UTC m=+0.121212447 container died 41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.187 227317 DEBUG nova.virt.libvirt.vif [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:15:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-431958299',display_name='tempest-ImagesTestJSON-server-431958299',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-431958299',id=14,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:15:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='0b5b72f4b8394daba5420fe9fc17a7bb',ramdisk_id='',reservation_id='r-al23wrd7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-596605936',owner_user_name='tempest-ImagesTestJSON-596605936-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:16:13Z,user_data=None,user_id='995bf2bf54f64b3490e7b8e751aedb77',uuid=0ba43746-10bc-41bd-aa56-246af8723901,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.188 227317 DEBUG nova.network.os_vif_util [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Converting VIF {"id": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "address": "fa:16:3e:a9:2e:64", "network": {"id": "ef015cda-1a8b-490a-a7b2-b92e5cef4798", "bridge": "br-int", "label": "tempest-ImagesTestJSON-769491526-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b5b72f4b8394daba5420fe9fc17a7bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap082b9c3e-33", "ovs_interfaceid": "082b9c3e-3355-4d4d-ac73-b0b4fd981f48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.189 227317 DEBUG nova.network.os_vif_util [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.189 227317 DEBUG os_vif [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.191 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.191 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082b9c3e-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.192 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.194 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.195 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.197 227317 INFO os_vif [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:2e:64,bridge_name='br-int',has_traffic_filtering=True,id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48,network=Network(ef015cda-1a8b-490a-a7b2-b92e5cef4798),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap082b9c3e-33')#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.214 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:20 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb-userdata-shm.mount: Deactivated successfully.
Jan 26 13:16:20 np0005596062 systemd[1]: var-lib-containers-storage-overlay-988d60742055f7903448358d0000d7bcd69e52df2eb4c6294fd495f913f30c12-merged.mount: Deactivated successfully.
Jan 26 13:16:20 np0005596062 podman[242481]: 2026-01-26 18:16:20.345298239 +0000 UTC m=+0.443397506 container cleanup 41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 13:16:20 np0005596062 systemd[1]: libpod-conmon-41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb.scope: Deactivated successfully.
Jan 26 13:16:20 np0005596062 podman[242540]: 2026-01-26 18:16:20.428740703 +0000 UTC m=+0.056667225 container remove 41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.435 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9c8cc8d7-18b1-48ae-ab46-2d62c1fd103f]: (4, ('Mon Jan 26 06:16:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798 (41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb)\n41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb\nMon Jan 26 06:16:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798 (41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb)\n41589d118edc20d9cb7729c310abc90da3d51f15998f2753d322f8feda781bcb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.437 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[68b37987-8205-4f15-a141-51cf67411aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.438 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef015cda-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.440 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:20 np0005596062 kernel: tapef015cda-10: left promiscuous mode
Jan 26 13:16:20 np0005596062 nova_compute[227313]: 2026-01-26 18:16:20.456 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.460 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[18b80990-3c73-4e64-8009-0bfdd32ad83f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.476 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[8290ead3-e6a7-4fdc-99a8-d15eab36aa15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.478 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[684e8dd9-05a5-42b7-a219-a688a78a6922]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.498 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5ef15d-26ea-4604-a90f-0de88bedeb0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511654, 'reachable_time': 17732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242555, 'error': None, 'target': 'ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.502 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef015cda-1a8b-490a-a7b2-b92e5cef4798 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:16:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:16:20.502 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[8eaa2c90-4cc1-4cb4-83f2-9caea8788c60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:16:20 np0005596062 systemd[1]: run-netns-ovnmeta\x2def015cda\x2d1a8b\x2d490a\x2da7b2\x2db92e5cef4798.mount: Deactivated successfully.
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.189 227317 DEBUG nova.compute.manager [req-a9f5a423-bb4b-494a-b545-eeda9a166ff0 req-b6a022ac-85e1-43df-8d2a-48a969c54813 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-vif-unplugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.190 227317 DEBUG oslo_concurrency.lockutils [req-a9f5a423-bb4b-494a-b545-eeda9a166ff0 req-b6a022ac-85e1-43df-8d2a-48a969c54813 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.190 227317 DEBUG oslo_concurrency.lockutils [req-a9f5a423-bb4b-494a-b545-eeda9a166ff0 req-b6a022ac-85e1-43df-8d2a-48a969c54813 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.190 227317 DEBUG oslo_concurrency.lockutils [req-a9f5a423-bb4b-494a-b545-eeda9a166ff0 req-b6a022ac-85e1-43df-8d2a-48a969c54813 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.191 227317 DEBUG nova.compute.manager [req-a9f5a423-bb4b-494a-b545-eeda9a166ff0 req-b6a022ac-85e1-43df-8d2a-48a969c54813 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] No waiting events found dispatching network-vif-unplugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.191 227317 DEBUG nova.compute.manager [req-a9f5a423-bb4b-494a-b545-eeda9a166ff0 req-b6a022ac-85e1-43df-8d2a-48a969c54813 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-vif-unplugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:16:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:21.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.532 227317 INFO nova.virt.libvirt.driver [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Deleting instance files /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901_del#033[00m
Jan 26 13:16:21 np0005596062 nova_compute[227313]: 2026-01-26 18:16:21.533 227317 INFO nova.virt.libvirt.driver [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Deletion of /var/lib/nova/instances/0ba43746-10bc-41bd-aa56-246af8723901_del complete#033[00m
Jan 26 13:16:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:21.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:22 np0005596062 nova_compute[227313]: 2026-01-26 18:16:22.696 227317 INFO nova.compute.manager [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Took 2.99 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:16:22 np0005596062 nova_compute[227313]: 2026-01-26 18:16:22.699 227317 DEBUG oslo.service.loopingcall [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:16:22 np0005596062 nova_compute[227313]: 2026-01-26 18:16:22.700 227317 DEBUG nova.compute.manager [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:16:22 np0005596062 nova_compute[227313]: 2026-01-26 18:16:22.700 227317 DEBUG nova.network.neutron [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:16:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 26 13:16:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:23.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:23 np0005596062 nova_compute[227313]: 2026-01-26 18:16:23.405 227317 DEBUG nova.compute.manager [req-ac9f95a2-00c9-4ae2-950d-c00987a4dedf req-0598bcb1-d786-40c9-9c75-a542759e8271 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:16:23 np0005596062 nova_compute[227313]: 2026-01-26 18:16:23.405 227317 DEBUG oslo_concurrency.lockutils [req-ac9f95a2-00c9-4ae2-950d-c00987a4dedf req-0598bcb1-d786-40c9-9c75-a542759e8271 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0ba43746-10bc-41bd-aa56-246af8723901-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:23 np0005596062 nova_compute[227313]: 2026-01-26 18:16:23.405 227317 DEBUG oslo_concurrency.lockutils [req-ac9f95a2-00c9-4ae2-950d-c00987a4dedf req-0598bcb1-d786-40c9-9c75-a542759e8271 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:23 np0005596062 nova_compute[227313]: 2026-01-26 18:16:23.406 227317 DEBUG oslo_concurrency.lockutils [req-ac9f95a2-00c9-4ae2-950d-c00987a4dedf req-0598bcb1-d786-40c9-9c75-a542759e8271 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:23 np0005596062 nova_compute[227313]: 2026-01-26 18:16:23.406 227317 DEBUG nova.compute.manager [req-ac9f95a2-00c9-4ae2-950d-c00987a4dedf req-0598bcb1-d786-40c9-9c75-a542759e8271 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] No waiting events found dispatching network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:16:23 np0005596062 nova_compute[227313]: 2026-01-26 18:16:23.406 227317 WARNING nova.compute.manager [req-ac9f95a2-00c9-4ae2-950d-c00987a4dedf req-0598bcb1-d786-40c9-9c75-a542759e8271 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received unexpected event network-vif-plugged-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 for instance with vm_state paused and task_state deleting.#033[00m
Jan 26 13:16:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:23.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:25 np0005596062 nova_compute[227313]: 2026-01-26 18:16:25.194 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:25 np0005596062 nova_compute[227313]: 2026-01-26 18:16:25.200 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:25.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:25.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:16:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:16:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.062 227317 DEBUG nova.network.neutron [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.067 227317 DEBUG nova.compute.manager [req-b7132abd-0159-4fcd-a55c-271f46bb11be req-bb1315d7-8d17-47fd-8569-5086185bd58c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Received event network-vif-deleted-082b9c3e-3355-4d4d-ac73-b0b4fd981f48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.067 227317 INFO nova.compute.manager [req-b7132abd-0159-4fcd-a55c-271f46bb11be req-bb1315d7-8d17-47fd-8569-5086185bd58c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Neutron deleted interface 082b9c3e-3355-4d4d-ac73-b0b4fd981f48; detaching it from the instance and deleting it from the info cache#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.068 227317 DEBUG nova.network.neutron [req-b7132abd-0159-4fcd-a55c-271f46bb11be req-bb1315d7-8d17-47fd-8569-5086185bd58c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.095 227317 INFO nova.compute.manager [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Took 3.39 seconds to deallocate network for instance.#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.131 227317 DEBUG nova.compute.manager [req-b7132abd-0159-4fcd-a55c-271f46bb11be req-bb1315d7-8d17-47fd-8569-5086185bd58c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Detach interface failed, port_id=082b9c3e-3355-4d4d-ac73-b0b4fd981f48, reason: Instance 0ba43746-10bc-41bd-aa56-246af8723901 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.320 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.321 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.389 227317 DEBUG oslo_concurrency.processutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:16:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:16:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2617893370' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.872 227317 DEBUG oslo_concurrency.processutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.879 227317 DEBUG nova.compute.provider_tree [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:16:26 np0005596062 nova_compute[227313]: 2026-01-26 18:16:26.987 227317 DEBUG nova.scheduler.client.report [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:16:27 np0005596062 nova_compute[227313]: 2026-01-26 18:16:27.068 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:27 np0005596062 nova_compute[227313]: 2026-01-26 18:16:27.142 227317 INFO nova.scheduler.client.report [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Deleted allocations for instance 0ba43746-10bc-41bd-aa56-246af8723901#033[00m
Jan 26 13:16:27 np0005596062 nova_compute[227313]: 2026-01-26 18:16:27.238 227317 DEBUG oslo_concurrency.lockutils [None req-963b6caf-0de3-4622-bdf9-8e302c9b9ad5 995bf2bf54f64b3490e7b8e751aedb77 0b5b72f4b8394daba5420fe9fc17a7bb - - default default] Lock "0ba43746-10bc-41bd-aa56-246af8723901" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:16:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:27.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:27.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:29.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:29.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:30 np0005596062 nova_compute[227313]: 2026-01-26 18:16:30.198 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:30 np0005596062 nova_compute[227313]: 2026-01-26 18:16:30.203 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:31.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:31.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:33.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:33.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:34 np0005596062 nova_compute[227313]: 2026-01-26 18:16:34.950 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451379.9490833, 0ba43746-10bc-41bd-aa56-246af8723901 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:16:34 np0005596062 nova_compute[227313]: 2026-01-26 18:16:34.950 227317 INFO nova.compute.manager [-] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:16:35 np0005596062 nova_compute[227313]: 2026-01-26 18:16:35.189 227317 DEBUG nova.compute.manager [None req-92385625-61a5-40b4-9d78-6dd363f7a2cf - - - - - -] [instance: 0ba43746-10bc-41bd-aa56-246af8723901] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:16:35 np0005596062 nova_compute[227313]: 2026-01-26 18:16:35.203 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:35 np0005596062 nova_compute[227313]: 2026-01-26 18:16:35.205 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:35.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:16:35 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:16:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:35.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:37.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:37.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:39.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:39.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:40 np0005596062 nova_compute[227313]: 2026-01-26 18:16:40.206 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:40 np0005596062 nova_compute[227313]: 2026-01-26 18:16:40.207 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:40 np0005596062 podman[242821]: 2026-01-26 18:16:40.877195604 +0000 UTC m=+0.073361497 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 26 13:16:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:16:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:41.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:16:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:41.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:43.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:43.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:45 np0005596062 nova_compute[227313]: 2026-01-26 18:16:45.209 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:45.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:45.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:47.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:47.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:49 np0005596062 nova_compute[227313]: 2026-01-26 18:16:49.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:16:49 np0005596062 nova_compute[227313]: 2026-01-26 18:16:49.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:16:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:49.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:49.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:49 np0005596062 podman[242846]: 2026-01-26 18:16:49.954221993 +0000 UTC m=+0.158581158 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 26 13:16:50 np0005596062 nova_compute[227313]: 2026-01-26 18:16:50.211 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:16:50 np0005596062 nova_compute[227313]: 2026-01-26 18:16:50.214 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:16:50 np0005596062 nova_compute[227313]: 2026-01-26 18:16:50.214 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 13:16:50 np0005596062 nova_compute[227313]: 2026-01-26 18:16:50.214 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:16:50 np0005596062 nova_compute[227313]: 2026-01-26 18:16:50.253 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:50 np0005596062 nova_compute[227313]: 2026-01-26 18:16:50.254 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:16:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:51.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:53.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:16:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:53.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:16:55 np0005596062 nova_compute[227313]: 2026-01-26 18:16:55.254 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:16:55 np0005596062 nova_compute[227313]: 2026-01-26 18:16:55.256 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:55 np0005596062 nova_compute[227313]: 2026-01-26 18:16:55.256 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 13:16:55 np0005596062 nova_compute[227313]: 2026-01-26 18:16:55.256 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:16:55 np0005596062 nova_compute[227313]: 2026-01-26 18:16:55.257 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:16:55 np0005596062 nova_compute[227313]: 2026-01-26 18:16:55.258 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:55.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:55.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:57.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:57.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:16:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:16:59.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:16:59 np0005596062 nova_compute[227313]: 2026-01-26 18:16:59.833 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:16:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:16:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:16:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:16:59.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.072 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.072 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.073 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.073 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.139 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.140 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.140 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.140 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.140 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.258 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:17:00 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2927316148' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.595 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.771 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.773 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4829MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.773 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.773 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.865 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:17:00 np0005596062 nova_compute[227313]: 2026-01-26 18:17:00.865 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.016 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.157 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.158 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.177 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:17:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:01.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.326 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.347 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:17:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3200679386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.828 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.835 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:17:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:01.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.891 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.928 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.928 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.929 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.929 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.974 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:17:01 np0005596062 nova_compute[227313]: 2026-01-26 18:17:01.975 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.176 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.176 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.196 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.320 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.321 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.337 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.338 227317 INFO nova.compute.claims [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.459 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:17:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2683290014' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.912 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:02 np0005596062 nova_compute[227313]: 2026-01-26 18:17:02.921 227317 DEBUG nova.compute.provider_tree [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:17:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.087 227317 DEBUG nova.scheduler.client.report [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.125 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.126 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.176 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.176 227317 DEBUG nova.network.neutron [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.206 227317 INFO nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.249 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:17:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:03.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.364 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.366 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.367 227317 INFO nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Creating image(s)#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.402 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.437 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.470 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.476 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.559 227317 DEBUG nova.policy [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44f877c47b324a029aacaa24f3dcf0a5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9530c76d02c946779e365ca19e12a603', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.564 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.565 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.565 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.566 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.598 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.604 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 09b69264-452e-4074-ae2c-e2c72688d4f6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:03.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.959 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.960 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:03 np0005596062 nova_compute[227313]: 2026-01-26 18:17:03.961 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:04 np0005596062 nova_compute[227313]: 2026-01-26 18:17:04.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:04.182 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:17:04 np0005596062 nova_compute[227313]: 2026-01-26 18:17:04.183 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:04 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:04.184 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:17:04 np0005596062 nova_compute[227313]: 2026-01-26 18:17:04.684 227317 DEBUG nova.network.neutron [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Successfully created port: 02057853-1efb-4d54-8330-555b9770d46b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.079 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.079 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.256 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 09b69264-452e-4074-ae2c-e2c72688d4f6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.653s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.296 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:05.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.351 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] resizing rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:17:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:05.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.877 227317 DEBUG nova.objects.instance [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lazy-loading 'migration_context' on Instance uuid 09b69264-452e-4074-ae2c-e2c72688d4f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.896 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.897 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Ensure instance console log exists: /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.897 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.898 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.898 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.980 227317 DEBUG nova.network.neutron [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Successfully updated port: 02057853-1efb-4d54-8330-555b9770d46b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.995 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "refresh_cache-09b69264-452e-4074-ae2c-e2c72688d4f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.995 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquired lock "refresh_cache-09b69264-452e-4074-ae2c-e2c72688d4f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:17:05 np0005596062 nova_compute[227313]: 2026-01-26 18:17:05.995 227317 DEBUG nova.network.neutron [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:17:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:06.187 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:06 np0005596062 nova_compute[227313]: 2026-01-26 18:17:06.206 227317 DEBUG nova.compute.manager [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-changed-02057853-1efb-4d54-8330-555b9770d46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:06 np0005596062 nova_compute[227313]: 2026-01-26 18:17:06.207 227317 DEBUG nova.compute.manager [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Refreshing instance network info cache due to event network-changed-02057853-1efb-4d54-8330-555b9770d46b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:17:06 np0005596062 nova_compute[227313]: 2026-01-26 18:17:06.207 227317 DEBUG oslo_concurrency.lockutils [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-09b69264-452e-4074-ae2c-e2c72688d4f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:17:06 np0005596062 nova_compute[227313]: 2026-01-26 18:17:06.336 227317 DEBUG nova.network.neutron [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:17:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:07.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.637 227317 DEBUG nova.network.neutron [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Updating instance_info_cache with network_info: [{"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.657 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Releasing lock "refresh_cache-09b69264-452e-4074-ae2c-e2c72688d4f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.657 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Instance network_info: |[{"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.658 227317 DEBUG oslo_concurrency.lockutils [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-09b69264-452e-4074-ae2c-e2c72688d4f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.659 227317 DEBUG nova.network.neutron [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Refreshing network info cache for port 02057853-1efb-4d54-8330-555b9770d46b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.661 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Start _get_guest_xml network_info=[{"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.667 227317 WARNING nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.671 227317 DEBUG nova.virt.libvirt.host [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.672 227317 DEBUG nova.virt.libvirt.host [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.679 227317 DEBUG nova.virt.libvirt.host [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.680 227317 DEBUG nova.virt.libvirt.host [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.681 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.681 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.682 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.682 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.682 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.682 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.683 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.683 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.683 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.683 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.684 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.684 227317 DEBUG nova.virt.hardware [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:17:07 np0005596062 nova_compute[227313]: 2026-01-26 18:17:07.687 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:07.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:17:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:17:08 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/718288437' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.153 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.214 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.220 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 26 13:17:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:17:08 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2045119595' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.707 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.709 227317 DEBUG nova.virt.libvirt.vif [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-570865210',display_name='tempest-DeleteServersTestJSON-server-570865210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-570865210',id=15,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9530c76d02c946779e365ca19e12a603',ramdisk_id='',reservation_id='r-kdxftci0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-168913508',owner_user_name='tempest-DeleteServersTestJSON-1
68913508-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:17:03Z,user_data=None,user_id='44f877c47b324a029aacaa24f3dcf0a5',uuid=09b69264-452e-4074-ae2c-e2c72688d4f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.709 227317 DEBUG nova.network.os_vif_util [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Converting VIF {"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.710 227317 DEBUG nova.network.os_vif_util [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.712 227317 DEBUG nova.objects.instance [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 09b69264-452e-4074-ae2c-e2c72688d4f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.739 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <uuid>09b69264-452e-4074-ae2c-e2c72688d4f6</uuid>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <name>instance-0000000f</name>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:name>tempest-DeleteServersTestJSON-server-570865210</nova:name>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:17:07</nova:creationTime>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:user uuid="44f877c47b324a029aacaa24f3dcf0a5">tempest-DeleteServersTestJSON-168913508-project-member</nova:user>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:project uuid="9530c76d02c946779e365ca19e12a603">tempest-DeleteServersTestJSON-168913508</nova:project>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <nova:port uuid="02057853-1efb-4d54-8330-555b9770d46b">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <entry name="serial">09b69264-452e-4074-ae2c-e2c72688d4f6</entry>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <entry name="uuid">09b69264-452e-4074-ae2c-e2c72688d4f6</entry>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/09b69264-452e-4074-ae2c-e2c72688d4f6_disk">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/09b69264-452e-4074-ae2c-e2c72688d4f6_disk.config">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:b5:dc:e0"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <target dev="tap02057853-1e"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/console.log" append="off"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:17:08 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:17:08 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:17:08 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:17:08 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.741 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Preparing to wait for external event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.742 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.742 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.742 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.743 227317 DEBUG nova.virt.libvirt.vif [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-570865210',display_name='tempest-DeleteServersTestJSON-server-570865210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-570865210',id=15,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9530c76d02c946779e365ca19e12a603',ramdisk_id='',reservation_id='r-kdxftci0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-168913508',owner_user_name='tempest-DeleteServers
TestJSON-168913508-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:17:03Z,user_data=None,user_id='44f877c47b324a029aacaa24f3dcf0a5',uuid=09b69264-452e-4074-ae2c-e2c72688d4f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.744 227317 DEBUG nova.network.os_vif_util [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Converting VIF {"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.744 227317 DEBUG nova.network.os_vif_util [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.745 227317 DEBUG os_vif [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.745 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.746 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.746 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.751 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.751 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02057853-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.752 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02057853-1e, col_values=(('external_ids', {'iface-id': '02057853-1efb-4d54-8330-555b9770d46b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:dc:e0', 'vm-uuid': '09b69264-452e-4074-ae2c-e2c72688d4f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.753 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:08 np0005596062 NetworkManager[48993]: <info>  [1769451428.7549] manager: (tap02057853-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.757 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.762 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.763 227317 INFO os_vif [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e')#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.826 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.827 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.827 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] No VIF found with MAC fa:16:3e:b5:dc:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.827 227317 INFO nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Using config drive#033[00m
Jan 26 13:17:08 np0005596062 nova_compute[227313]: 2026-01-26 18:17:08.861 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:09.167 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:09.168 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:09.168 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:09.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.550 227317 INFO nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Creating config drive at /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/disk.config#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.556 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcrxfe5js execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.589 227317 DEBUG nova.network.neutron [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Updated VIF entry in instance network info cache for port 02057853-1efb-4d54-8330-555b9770d46b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.590 227317 DEBUG nova.network.neutron [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Updating instance_info_cache with network_info: [{"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.619 227317 DEBUG oslo_concurrency.lockutils [req-d36e5420-cf30-4919-8c73-fed7e08dc68a req-3a954e7b-2033-4d87-86be-b7e2425c434c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-09b69264-452e-4074-ae2c-e2c72688d4f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.696 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcrxfe5js" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.730 227317 DEBUG nova.storage.rbd_utils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] rbd image 09b69264-452e-4074-ae2c-e2c72688d4f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:09 np0005596062 nova_compute[227313]: 2026-01-26 18:17:09.735 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/disk.config 09b69264-452e-4074-ae2c-e2c72688d4f6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:09.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.247 227317 DEBUG oslo_concurrency.processutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/disk.config 09b69264-452e-4074-ae2c-e2c72688d4f6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.248 227317 INFO nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Deleting local config drive /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6/disk.config because it was imported into RBD.#033[00m
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.263 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 kernel: tap02057853-1e: entered promiscuous mode
Jan 26 13:17:10 np0005596062 NetworkManager[48993]: <info>  [1769451430.3294] manager: (tap02057853-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.328 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.335 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:10Z|00116|binding|INFO|Claiming lport 02057853-1efb-4d54-8330-555b9770d46b for this chassis.
Jan 26 13:17:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:10Z|00117|binding|INFO|02057853-1efb-4d54-8330-555b9770d46b: Claiming fa:16:3e:b5:dc:e0 10.100.0.13
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.339 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.355 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:dc:e0 10.100.0.13'], port_security=['fa:16:3e:b5:dc:e0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '09b69264-452e-4074-ae2c-e2c72688d4f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b81ecfff-beb3-4914-aa86-877e7993886f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9530c76d02c946779e365ca19e12a603', 'neutron:revision_number': '2', 'neutron:security_group_ids': '208dc1de-6968-477d-9727-f186b3c3500e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3690b82b-570d-459f-8550-e70dd105d954, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=02057853-1efb-4d54-8330-555b9770d46b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.357 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 02057853-1efb-4d54-8330-555b9770d46b in datapath b81ecfff-beb3-4914-aa86-877e7993886f bound to our chassis#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.358 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b81ecfff-beb3-4914-aa86-877e7993886f#033[00m
Jan 26 13:17:10 np0005596062 systemd-udevd[243301]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:17:10 np0005596062 systemd-machined[195380]: New machine qemu-12-instance-0000000f.
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.372 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cfea4b-1874-4a4f-87ec-bcd1511e7aea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.373 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb81ecfff-b1 in ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.375 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb81ecfff-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.375 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8568bf-0ab6-46e0-8db9-19deca57f07f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.376 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b319ed0b-9ec9-4816-8a74-32a862b5cf43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 NetworkManager[48993]: <info>  [1769451430.3896] device (tap02057853-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:17:10 np0005596062 NetworkManager[48993]: <info>  [1769451430.3905] device (tap02057853-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.395 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d14c5e-db3a-43e4-9bd6-843f69f1c458]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 systemd[1]: Started Virtual Machine qemu-12-instance-0000000f.
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.425 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f16428a4-12e5-4aa4-ae67-b2f8aa1c90f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:10Z|00118|binding|INFO|Setting lport 02057853-1efb-4d54-8330-555b9770d46b ovn-installed in OVS
Jan 26 13:17:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:10Z|00119|binding|INFO|Setting lport 02057853-1efb-4d54-8330-555b9770d46b up in Southbound
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.435 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.437 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.468 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[1e83618e-9942-45fb-96d6-9062f6a06536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 NetworkManager[48993]: <info>  [1769451430.4762] manager: (tapb81ecfff-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.477 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[56fe9cc9-e598-45e1-afc4-90335169a673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.524 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8f66c6-f88f-454d-8625-7374c0d27362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.528 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2afaab-67d8-413e-b8ca-4222f8035d45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 NetworkManager[48993]: <info>  [1769451430.5530] device (tapb81ecfff-b0): carrier: link connected
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.557 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[6b55d9af-767a-4858-9dee-495331665d0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.573 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[699166a9-6182-4f5f-9ec4-61aa3ee91203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb81ecfff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:74:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519060, 'reachable_time': 15736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243334, 'error': None, 'target': 'ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.594 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[05e134b9-6d2d-42c0-8ce9-2c791783672c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:745b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519060, 'tstamp': 519060}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243335, 'error': None, 'target': 'ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.617 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d78c45ee-1f1b-4a82-807d-329ff4e51aa6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb81ecfff-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:74:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519060, 'reachable_time': 15736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243336, 'error': None, 'target': 'ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.654 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0e54fc23-e327-4cd8-b983-28712b2c5799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.729 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5c49b2a9-32aa-4e48-92e5-d77de5131957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.731 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb81ecfff-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.732 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.732 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb81ecfff-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:10 np0005596062 NetworkManager[48993]: <info>  [1769451430.7349] manager: (tapb81ecfff-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.734 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 kernel: tapb81ecfff-b0: entered promiscuous mode
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.739 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.740 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb81ecfff-b0, col_values=(('external_ids', {'iface-id': '3492fc3a-a07b-42f6-8662-eb073abacb06'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:10 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:10Z|00120|binding|INFO|Releasing lport 3492fc3a-a07b-42f6-8662-eb073abacb06 from this chassis (sb_readonly=0)
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.741 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 nova_compute[227313]: 2026-01-26 18:17:10.755 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.756 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b81ecfff-beb3-4914-aa86-877e7993886f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b81ecfff-beb3-4914-aa86-877e7993886f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.757 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5ee213-5fc3-4cc0-8c7d-b72f0fa424af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.758 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-b81ecfff-beb3-4914-aa86-877e7993886f
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/b81ecfff-beb3-4914-aa86-877e7993886f.pid.haproxy
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID b81ecfff-beb3-4914-aa86-877e7993886f
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:17:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:10.758 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f', 'env', 'PROCESS_TAG=haproxy-b81ecfff-beb3-4914-aa86-877e7993886f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b81ecfff-beb3-4914-aa86-877e7993886f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.108 227317 DEBUG nova.compute.manager [req-93ffbe31-e9ad-4258-8c9d-4ceb70d5c955 req-de236719-dfda-4788-91f6-89b71bf33674 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.110 227317 DEBUG oslo_concurrency.lockutils [req-93ffbe31-e9ad-4258-8c9d-4ceb70d5c955 req-de236719-dfda-4788-91f6-89b71bf33674 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.110 227317 DEBUG oslo_concurrency.lockutils [req-93ffbe31-e9ad-4258-8c9d-4ceb70d5c955 req-de236719-dfda-4788-91f6-89b71bf33674 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.110 227317 DEBUG oslo_concurrency.lockutils [req-93ffbe31-e9ad-4258-8c9d-4ceb70d5c955 req-de236719-dfda-4788-91f6-89b71bf33674 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.110 227317 DEBUG nova.compute.manager [req-93ffbe31-e9ad-4258-8c9d-4ceb70d5c955 req-de236719-dfda-4788-91f6-89b71bf33674 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Processing event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:17:11 np0005596062 podman[243409]: 2026-01-26 18:17:11.136428484 +0000 UTC m=+0.058523203 container create fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.150 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.151 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451431.1490862, 09b69264-452e-4074-ae2c-e2c72688d4f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.151 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] VM Started (Lifecycle Event)#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.156 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.160 227317 INFO nova.virt.libvirt.driver [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Instance spawned successfully.#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.160 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:17:11 np0005596062 systemd[1]: Started libpod-conmon-fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e.scope.
Jan 26 13:17:11 np0005596062 podman[243409]: 2026-01-26 18:17:11.104997731 +0000 UTC m=+0.027092470 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:17:11 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.217 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:11 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4b6aa0f1b17c8c900df4e630fac4165e360762d2e485f22517c7103b8441dd5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.227 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:17:11 np0005596062 podman[243409]: 2026-01-26 18:17:11.234432735 +0000 UTC m=+0.156527474 container init fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 13:17:11 np0005596062 podman[243409]: 2026-01-26 18:17:11.242122999 +0000 UTC m=+0.164217718 container start fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:17:11 np0005596062 podman[243423]: 2026-01-26 18:17:11.25687843 +0000 UTC m=+0.066636669 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 13:17:11 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [NOTICE]   (243445) : New worker (243447) forked
Jan 26 13:17:11 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [NOTICE]   (243445) : Loading success.
Jan 26 13:17:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:11.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.327 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.328 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.328 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.328 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.329 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.329 227317 DEBUG nova.virt.libvirt.driver [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.375 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.376 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451431.1494482, 09b69264-452e-4074-ae2c-e2c72688d4f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.376 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.395 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.402 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451431.1554456, 09b69264-452e-4074-ae2c-e2c72688d4f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.402 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.407 227317 INFO nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Took 8.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.407 227317 DEBUG nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.416 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.419 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.443 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.462 227317 INFO nova.compute.manager [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Took 9.21 seconds to build instance.#033[00m
Jan 26 13:17:11 np0005596062 nova_compute[227313]: 2026-01-26 18:17:11.480 227317 DEBUG oslo_concurrency.lockutils [None req-8a677326-294f-4bdf-a3ec-1aad5761fc0a 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:11.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:13.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.350 227317 DEBUG nova.compute.manager [req-6ca12a02-c2e8-4191-8db2-5c582f8950c7 req-067faeb4-7d52-434d-9fb9-38370203769f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.350 227317 DEBUG oslo_concurrency.lockutils [req-6ca12a02-c2e8-4191-8db2-5c582f8950c7 req-067faeb4-7d52-434d-9fb9-38370203769f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.351 227317 DEBUG oslo_concurrency.lockutils [req-6ca12a02-c2e8-4191-8db2-5c582f8950c7 req-067faeb4-7d52-434d-9fb9-38370203769f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.351 227317 DEBUG oslo_concurrency.lockutils [req-6ca12a02-c2e8-4191-8db2-5c582f8950c7 req-067faeb4-7d52-434d-9fb9-38370203769f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.351 227317 DEBUG nova.compute.manager [req-6ca12a02-c2e8-4191-8db2-5c582f8950c7 req-067faeb4-7d52-434d-9fb9-38370203769f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] No waiting events found dispatching network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.351 227317 WARNING nova.compute.manager [req-6ca12a02-c2e8-4191-8db2-5c582f8950c7 req-067faeb4-7d52-434d-9fb9-38370203769f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received unexpected event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b for instance with vm_state active and task_state None.#033[00m
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.754 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:13.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.961 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.961 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.962 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.962 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.962 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.964 227317 INFO nova.compute.manager [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Terminating instance
Jan 26 13:17:13 np0005596062 nova_compute[227313]: 2026-01-26 18:17:13.965 227317 DEBUG nova.compute.manager [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 13:17:14 np0005596062 kernel: tap02057853-1e (unregistering): left promiscuous mode
Jan 26 13:17:14 np0005596062 NetworkManager[48993]: <info>  [1769451434.0134] device (tap02057853-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.029 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:14Z|00121|binding|INFO|Releasing lport 02057853-1efb-4d54-8330-555b9770d46b from this chassis (sb_readonly=0)
Jan 26 13:17:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:14Z|00122|binding|INFO|Setting lport 02057853-1efb-4d54-8330-555b9770d46b down in Southbound
Jan 26 13:17:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:14Z|00123|binding|INFO|Removing iface tap02057853-1e ovn-installed in OVS
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.039 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:dc:e0 10.100.0.13'], port_security=['fa:16:3e:b5:dc:e0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '09b69264-452e-4074-ae2c-e2c72688d4f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b81ecfff-beb3-4914-aa86-877e7993886f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9530c76d02c946779e365ca19e12a603', 'neutron:revision_number': '4', 'neutron:security_group_ids': '208dc1de-6968-477d-9727-f186b3c3500e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3690b82b-570d-459f-8550-e70dd105d954, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=02057853-1efb-4d54-8330-555b9770d46b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.040 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 02057853-1efb-4d54-8330-555b9770d46b in datapath b81ecfff-beb3-4914-aa86-877e7993886f unbound from our chassis
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.041 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b81ecfff-beb3-4914-aa86-877e7993886f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.042 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a2003821-2b84-400d-ad41-6e401e8e6b77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.042 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f namespace which is not needed anymore
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.051 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 26 13:17:14 np0005596062 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000f.scope: Consumed 3.551s CPU time.
Jan 26 13:17:14 np0005596062 systemd-machined[195380]: Machine qemu-12-instance-0000000f terminated.
Jan 26 13:17:14 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [NOTICE]   (243445) : haproxy version is 2.8.14-c23fe91
Jan 26 13:17:14 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [NOTICE]   (243445) : path to executable is /usr/sbin/haproxy
Jan 26 13:17:14 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [WARNING]  (243445) : Exiting Master process...
Jan 26 13:17:14 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [ALERT]    (243445) : Current worker (243447) exited with code 143 (Terminated)
Jan 26 13:17:14 np0005596062 neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f[243426]: [WARNING]  (243445) : All workers exited. Exiting... (0)
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.195 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 systemd[1]: libpod-fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e.scope: Deactivated successfully.
Jan 26 13:17:14 np0005596062 conmon[243426]: conmon fec3942f0636af7af81c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e.scope/container/memory.events
Jan 26 13:17:14 np0005596062 podman[243531]: 2026-01-26 18:17:14.202594448 +0000 UTC m=+0.052117964 container died fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.203 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.221 227317 INFO nova.virt.libvirt.driver [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Instance destroyed successfully.
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.221 227317 DEBUG nova.objects.instance [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lazy-loading 'resources' on Instance uuid 09b69264-452e-4074-ae2c-e2c72688d4f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.237 227317 DEBUG nova.virt.libvirt.vif [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-570865210',display_name='tempest-DeleteServersTestJSON-server-570865210',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-570865210',id=15,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:17:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9530c76d02c946779e365ca19e12a603',ramdisk_id='',reservation_id='r-kdxftci0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-168913508',owner_user_name='tempest-DeleteServersTestJSON-168913508-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:17:11Z,user_data=None,user_id='44f877c47b324a029aacaa24f3dcf0a5',uuid=09b69264-452e-4074-ae2c-e2c72688d4f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.238 227317 DEBUG nova.network.os_vif_util [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Converting VIF {"id": "02057853-1efb-4d54-8330-555b9770d46b", "address": "fa:16:3e:b5:dc:e0", "network": {"id": "b81ecfff-beb3-4914-aa86-877e7993886f", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-522956968-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9530c76d02c946779e365ca19e12a603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02057853-1e", "ovs_interfaceid": "02057853-1efb-4d54-8330-555b9770d46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.239 227317 DEBUG nova.network.os_vif_util [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.240 227317 DEBUG os_vif [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 26 13:17:14 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e-userdata-shm.mount: Deactivated successfully.
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.242 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.242 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02057853-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 13:17:14 np0005596062 systemd[1]: var-lib-containers-storage-overlay-c4b6aa0f1b17c8c900df4e630fac4165e360762d2e485f22517c7103b8441dd5-merged.mount: Deactivated successfully.
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.246 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 26 13:17:14 np0005596062 podman[243531]: 2026-01-26 18:17:14.249534873 +0000 UTC m=+0.099058389 container cleanup fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.249 227317 INFO os_vif [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:dc:e0,bridge_name='br-int',has_traffic_filtering=True,id=02057853-1efb-4d54-8330-555b9770d46b,network=Network(b81ecfff-beb3-4914-aa86-877e7993886f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02057853-1e')
Jan 26 13:17:14 np0005596062 systemd[1]: libpod-conmon-fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e.scope: Deactivated successfully.
Jan 26 13:17:14 np0005596062 podman[243574]: 2026-01-26 18:17:14.314886937 +0000 UTC m=+0.042621922 container remove fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.322 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[31293ebf-b154-4124-8703-dfaaeebd690c]: (4, ('Mon Jan 26 06:17:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f (fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e)\nfec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e\nMon Jan 26 06:17:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f (fec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e)\nfec3942f0636af7af81c2653ced753ab59976ecba70ca8f9f67a1ef17f75906e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.324 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d77ee030-1eff-4683-93ca-2bc705bae57c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.325 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb81ecfff-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.327 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 kernel: tapb81ecfff-b0: left promiscuous mode
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.342 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.345 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f90a6dba-063d-45fb-88dd-aa82ca109ca0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.360 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ba889d93-4dcd-4571-9159-7a19989b635b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.361 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b35311f2-7b7f-4275-969c-4517aade01f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.381 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[475eb879-2caf-4bb3-bb03-3fb182b289d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519051, 'reachable_time': 43571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243600, 'error': None, 'target': 'ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.385 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b81ecfff-beb3-4914-aa86-877e7993886f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 13:17:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:14.385 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8cb963-7851-4a21-9660-41433983d208]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:17:14 np0005596062 systemd[1]: run-netns-ovnmeta\x2db81ecfff\x2dbeb3\x2d4914\x2daa86\x2d877e7993886f.mount: Deactivated successfully.
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.719 227317 INFO nova.virt.libvirt.driver [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Deleting instance files /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6_del
Jan 26 13:17:14 np0005596062 nova_compute[227313]: 2026-01-26 18:17:14.720 227317 INFO nova.virt.libvirt.driver [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Deletion of /var/lib/nova/instances/09b69264-452e-4074-ae2c-e2c72688d4f6_del complete
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.168 227317 INFO nova.compute.manager [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Took 1.20 seconds to destroy the instance on the hypervisor.
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.169 227317 DEBUG oslo.service.loopingcall [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.170 227317 DEBUG nova.compute.manager [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.170 227317 DEBUG nova.network.neutron [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.265 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:17:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:15.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.855 227317 DEBUG nova.compute.manager [req-68edb599-54a6-49db-9391-242d980c3938 req-99fa0a9b-de42-4bc7-9222-f9a5513399cf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-vif-unplugged-02057853-1efb-4d54-8330-555b9770d46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.856 227317 DEBUG oslo_concurrency.lockutils [req-68edb599-54a6-49db-9391-242d980c3938 req-99fa0a9b-de42-4bc7-9222-f9a5513399cf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.856 227317 DEBUG oslo_concurrency.lockutils [req-68edb599-54a6-49db-9391-242d980c3938 req-99fa0a9b-de42-4bc7-9222-f9a5513399cf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.857 227317 DEBUG oslo_concurrency.lockutils [req-68edb599-54a6-49db-9391-242d980c3938 req-99fa0a9b-de42-4bc7-9222-f9a5513399cf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.857 227317 DEBUG nova.compute.manager [req-68edb599-54a6-49db-9391-242d980c3938 req-99fa0a9b-de42-4bc7-9222-f9a5513399cf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] No waiting events found dispatching network-vif-unplugged-02057853-1efb-4d54-8330-555b9770d46b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 13:17:15 np0005596062 nova_compute[227313]: 2026-01-26 18:17:15.858 227317 DEBUG nova.compute.manager [req-68edb599-54a6-49db-9391-242d980c3938 req-99fa0a9b-de42-4bc7-9222-f9a5513399cf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-vif-unplugged-02057853-1efb-4d54-8330-555b9770d46b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 13:17:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:15.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:17.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.388 227317 DEBUG nova.network.neutron [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.518 227317 DEBUG nova.compute.manager [req-3e9324a5-9015-4264-a9f4-95a4126a9160 req-5ce17180-e6d7-4356-ae91-f87fffbbe59a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-vif-deleted-02057853-1efb-4d54-8330-555b9770d46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.518 227317 INFO nova.compute.manager [req-3e9324a5-9015-4264-a9f4-95a4126a9160 req-5ce17180-e6d7-4356-ae91-f87fffbbe59a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Neutron deleted interface 02057853-1efb-4d54-8330-555b9770d46b; detaching it from the instance and deleting it from the info cache#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.518 227317 DEBUG nova.network.neutron [req-3e9324a5-9015-4264-a9f4-95a4126a9160 req-5ce17180-e6d7-4356-ae91-f87fffbbe59a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.539 227317 INFO nova.compute.manager [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Took 2.37 seconds to deallocate network for instance.#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.546 227317 DEBUG nova.compute.manager [req-3e9324a5-9015-4264-a9f4-95a4126a9160 req-5ce17180-e6d7-4356-ae91-f87fffbbe59a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Detach interface failed, port_id=02057853-1efb-4d54-8330-555b9770d46b, reason: Instance 09b69264-452e-4074-ae2c-e2c72688d4f6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.641 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.641 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:17 np0005596062 nova_compute[227313]: 2026-01-26 18:17:17.703 227317 DEBUG oslo_concurrency.processutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:17.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:17:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1283345947' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.195 227317 DEBUG oslo_concurrency.processutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.205 227317 DEBUG nova.compute.provider_tree [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.390 227317 DEBUG nova.compute.manager [req-5e9baa7f-6885-4b9a-b5f4-9c1e9cdb3f9c req-594cb287-27cc-4372-9649-6be8da405376 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.390 227317 DEBUG oslo_concurrency.lockutils [req-5e9baa7f-6885-4b9a-b5f4-9c1e9cdb3f9c req-594cb287-27cc-4372-9649-6be8da405376 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.390 227317 DEBUG oslo_concurrency.lockutils [req-5e9baa7f-6885-4b9a-b5f4-9c1e9cdb3f9c req-594cb287-27cc-4372-9649-6be8da405376 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.390 227317 DEBUG oslo_concurrency.lockutils [req-5e9baa7f-6885-4b9a-b5f4-9c1e9cdb3f9c req-594cb287-27cc-4372-9649-6be8da405376 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.391 227317 DEBUG nova.compute.manager [req-5e9baa7f-6885-4b9a-b5f4-9c1e9cdb3f9c req-594cb287-27cc-4372-9649-6be8da405376 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] No waiting events found dispatching network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.391 227317 WARNING nova.compute.manager [req-5e9baa7f-6885-4b9a-b5f4-9c1e9cdb3f9c req-594cb287-27cc-4372-9649-6be8da405376 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Received unexpected event network-vif-plugged-02057853-1efb-4d54-8330-555b9770d46b for instance with vm_state deleted and task_state None.#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.398 227317 DEBUG nova.scheduler.client.report [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.431 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:18 np0005596062 nova_compute[227313]: 2026-01-26 18:17:18.489 227317 INFO nova.scheduler.client.report [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Deleted allocations for instance 09b69264-452e-4074-ae2c-e2c72688d4f6#033[00m
Jan 26 13:17:19 np0005596062 nova_compute[227313]: 2026-01-26 18:17:19.077 227317 DEBUG oslo_concurrency.lockutils [None req-7e2d868f-c872-4e2b-9bc4-ba16c8feadb9 44f877c47b324a029aacaa24f3dcf0a5 9530c76d02c946779e365ca19e12a603 - - default default] Lock "09b69264-452e-4074-ae2c-e2c72688d4f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:19 np0005596062 nova_compute[227313]: 2026-01-26 18:17:19.246 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:19.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:19.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:20 np0005596062 nova_compute[227313]: 2026-01-26 18:17:20.310 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:20 np0005596062 podman[243627]: 2026-01-26 18:17:20.877407889 +0000 UTC m=+0.086799764 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 13:17:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:21.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:21 np0005596062 nova_compute[227313]: 2026-01-26 18:17:21.718 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:21 np0005596062 nova_compute[227313]: 2026-01-26 18:17:21.718 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:21 np0005596062 nova_compute[227313]: 2026-01-26 18:17:21.807 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:17:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:21.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:22 np0005596062 nova_compute[227313]: 2026-01-26 18:17:22.072 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:22 np0005596062 nova_compute[227313]: 2026-01-26 18:17:22.073 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:22 np0005596062 nova_compute[227313]: 2026-01-26 18:17:22.083 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:17:22 np0005596062 nova_compute[227313]: 2026-01-26 18:17:22.083 227317 INFO nova.compute.claims [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:17:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:23.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:23 np0005596062 nova_compute[227313]: 2026-01-26 18:17:23.399 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:17:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1951405566' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:17:23 np0005596062 nova_compute[227313]: 2026-01-26 18:17:23.875 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:23 np0005596062 nova_compute[227313]: 2026-01-26 18:17:23.884 227317 DEBUG nova.compute.provider_tree [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:17:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:23.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:24 np0005596062 nova_compute[227313]: 2026-01-26 18:17:24.249 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:24 np0005596062 nova_compute[227313]: 2026-01-26 18:17:24.504 227317 DEBUG nova.scheduler.client.report [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:17:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:25.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:25 np0005596062 nova_compute[227313]: 2026-01-26 18:17:25.361 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:25.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:27.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:27.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:28 np0005596062 nova_compute[227313]: 2026-01-26 18:17:28.524 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 6.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:28 np0005596062 nova_compute[227313]: 2026-01-26 18:17:28.525 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.221 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451434.2190592, 09b69264-452e-4074-ae2c-e2c72688d4f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.221 227317 INFO nova.compute.manager [-] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.253 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:29.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.495 227317 DEBUG nova.compute.manager [None req-ce4ba4e5-9ce6-466c-8d55-f951e0252791 - - - - - -] [instance: 09b69264-452e-4074-ae2c-e2c72688d4f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.505 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.505 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.534 227317 INFO nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.554 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.653 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.657 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.658 227317 INFO nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Creating image(s)#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.697 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.726 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.751 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.755 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "845aad0744c07ae3a06850747475706fc56a381e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.756 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "845aad0744c07ae3a06850747475706fc56a381e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:29 np0005596062 nova_compute[227313]: 2026-01-26 18:17:29.760 227317 DEBUG nova.policy [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '01ca8f689d9d42eeb8784ceda61c0f3a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bcdb364505d445798cd4757dac03bd74', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:17:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:17:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:29.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:17:30 np0005596062 nova_compute[227313]: 2026-01-26 18:17:30.013 227317 DEBUG nova.virt.libvirt.imagebackend [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Image locations are: [{'url': 'rbd://d4cd1917-5876-51b6-bc64-65a16199754d/images/be7b1750-5d13-441e-bf97-67d885906c42/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://d4cd1917-5876-51b6-bc64-65a16199754d/images/be7b1750-5d13-441e-bf97-67d885906c42/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 26 13:17:30 np0005596062 nova_compute[227313]: 2026-01-26 18:17:30.363 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:30 np0005596062 nova_compute[227313]: 2026-01-26 18:17:30.953 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Successfully created port: eeccbbe5-aa0e-40af-a243-188f272a475e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.195 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.280 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.346 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.part --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.348 227317 DEBUG nova.virt.images [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] be7b1750-5d13-441e-bf97-67d885906c42 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.349 227317 DEBUG nova.privsep.utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.350 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.part /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:31.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.562 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.part /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.converted" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.567 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.660 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e.converted --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.663 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "845aad0744c07ae3a06850747475706fc56a381e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.699 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.704 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e 11feb864-0940-4546-8dee-7b4f295c60fa_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.762 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Successfully updated port: eeccbbe5-aa0e-40af-a243-188f272a475e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.773 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.774 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquired lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.774 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:17:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:31.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.918 227317 DEBUG nova.compute.manager [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received event network-changed-eeccbbe5-aa0e-40af-a243-188f272a475e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.918 227317 DEBUG nova.compute.manager [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Refreshing instance network info cache due to event network-changed-eeccbbe5-aa0e-40af-a243-188f272a475e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:17:31 np0005596062 nova_compute[227313]: 2026-01-26 18:17:31.919 227317 DEBUG oslo_concurrency.lockutils [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:17:32 np0005596062 nova_compute[227313]: 2026-01-26 18:17:32.010 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:17:32 np0005596062 nova_compute[227313]: 2026-01-26 18:17:32.911 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Updating instance_info_cache with network_info: [{"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:32 np0005596062 nova_compute[227313]: 2026-01-26 18:17:32.943 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Releasing lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:17:32 np0005596062 nova_compute[227313]: 2026-01-26 18:17:32.944 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance network_info: |[{"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:17:32 np0005596062 nova_compute[227313]: 2026-01-26 18:17:32.946 227317 DEBUG oslo_concurrency.lockutils [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:17:32 np0005596062 nova_compute[227313]: 2026-01-26 18:17:32.946 227317 DEBUG nova.network.neutron [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Refreshing network info cache for port eeccbbe5-aa0e-40af-a243-188f272a475e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:17:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:33.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:33.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.023 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e 11feb864-0940-4546-8dee-7b4f295c60fa_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.108 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] resizing rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.228 227317 DEBUG nova.objects.instance [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lazy-loading 'migration_context' on Instance uuid 11feb864-0940-4546-8dee-7b4f295c60fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:17:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.253 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.253 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Ensure instance console log exists: /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.254 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.255 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.255 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.257 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Start _get_guest_xml network_info=[{"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:29Z,direct_url=<?>,disk_format='qcow2',id=be7b1750-5d13-441e-bf97-67d885906c42,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:32Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': 'be7b1750-5d13-441e-bf97-67d885906c42'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.258 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.263 227317 WARNING nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.268 227317 DEBUG nova.virt.libvirt.host [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.268 227317 DEBUG nova.virt.libvirt.host [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.272 227317 DEBUG nova.virt.libvirt.host [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.272 227317 DEBUG nova.virt.libvirt.host [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.274 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.274 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:29Z,direct_url=<?>,disk_format='qcow2',id=be7b1750-5d13-441e-bf97-67d885906c42,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:32Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.275 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.275 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.275 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.275 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.276 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.276 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.276 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.277 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.277 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.277 227317 DEBUG nova.virt.hardware [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.280 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.559 227317 DEBUG nova.network.neutron [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Updated VIF entry in instance network info cache for port eeccbbe5-aa0e-40af-a243-188f272a475e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.560 227317 DEBUG nova.network.neutron [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Updating instance_info_cache with network_info: [{"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.585 227317 DEBUG oslo_concurrency.lockutils [req-1a305b6e-1b15-4af0-b8dc-33612e309247 req-f77582c4-a0ce-41c6-a518-3796c3b9805e 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:17:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:17:34 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2549238458' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.759 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.788 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:34 np0005596062 nova_compute[227313]: 2026-01-26 18:17:34.794 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:35 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:17:35 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2571783268' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:17:35 np0005596062 nova_compute[227313]: 2026-01-26 18:17:35.260 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:35 np0005596062 nova_compute[227313]: 2026-01-26 18:17:35.264 227317 DEBUG nova.virt.libvirt.vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:17:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1254891366',display_name='tempest-ListServerFiltersTestJSON-instance-1254891366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1254891366',id=16,image_ref='be7b1750-5d13-441e-bf97-67d885906c42',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bcdb364505d445798cd4757dac03bd74',ramdisk_id='',reservation_id='r-czv06y4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='be7b1750-5d13-441e-bf97-67d885906c42',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-124247066',owner_user_name='tempest-ListServerFiltersTestJSON-124247066-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:17:29Z,user_data=None,user_id='01ca8f689d9d42eeb8784ceda61c0f3a',uuid=11feb864-0940-4546-8dee-7b4f295c60fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:17:35 np0005596062 nova_compute[227313]: 2026-01-26 18:17:35.264 227317 DEBUG nova.network.os_vif_util [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Converting VIF {"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:17:35 np0005596062 nova_compute[227313]: 2026-01-26 18:17:35.265 227317 DEBUG nova.network.os_vif_util [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:17:35 np0005596062 nova_compute[227313]: 2026-01-26 18:17:35.267 227317 DEBUG nova.objects.instance [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11feb864-0940-4546-8dee-7b4f295c60fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:17:35 np0005596062 nova_compute[227313]: 2026-01-26 18:17:35.365 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:35.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:35.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.345 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <uuid>11feb864-0940-4546-8dee-7b4f295c60fa</uuid>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <name>instance-00000010</name>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1254891366</nova:name>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:17:34</nova:creationTime>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:user uuid="01ca8f689d9d42eeb8784ceda61c0f3a">tempest-ListServerFiltersTestJSON-124247066-project-member</nova:user>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:project uuid="bcdb364505d445798cd4757dac03bd74">tempest-ListServerFiltersTestJSON-124247066</nova:project>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="be7b1750-5d13-441e-bf97-67d885906c42"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <nova:port uuid="eeccbbe5-aa0e-40af-a243-188f272a475e">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <entry name="serial">11feb864-0940-4546-8dee-7b4f295c60fa</entry>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <entry name="uuid">11feb864-0940-4546-8dee-7b4f295c60fa</entry>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/11feb864-0940-4546-8dee-7b4f295c60fa_disk">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/11feb864-0940-4546-8dee-7b4f295c60fa_disk.config">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:f9:13:eb"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <target dev="tapeeccbbe5-aa"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/console.log" append="off"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:17:37 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:17:37 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:17:37 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:17:37 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.346 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Preparing to wait for external event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.346 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.346 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.347 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.347 227317 DEBUG nova.virt.libvirt.vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:17:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1254891366',display_name='tempest-ListServerFiltersTestJSON-instance-1254891366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1254891366',id=16,image_ref='be7b1750-5d13-441e-bf97-67d885906c42',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bcdb364505d445798cd4757dac03bd74',ramdisk_id='',reservation_id='r-czv06y4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='be7b1750-5d13-441e-bf97-67d885906c42',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-124247066',owner_user_name='tempest-ListServerFiltersTestJSON-124247066-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:17:29Z,user_data=None,user_id='01ca8f689d9d42eeb8784ceda61c0f3a',uuid=11feb864-0940-4546-8dee-7b4f295c60fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.347 227317 DEBUG nova.network.os_vif_util [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Converting VIF {"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.348 227317 DEBUG nova.network.os_vif_util [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.348 227317 DEBUG os_vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.349 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.349 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.349 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.353 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.353 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeeccbbe5-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.353 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeeccbbe5-aa, col_values=(('external_ids', {'iface-id': 'eeccbbe5-aa0e-40af-a243-188f272a475e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:13:eb', 'vm-uuid': '11feb864-0940-4546-8dee-7b4f295c60fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.355 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:37 np0005596062 NetworkManager[48993]: <info>  [1769451457.3566] manager: (tapeeccbbe5-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.357 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.363 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.366 227317 INFO os_vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa')#033[00m
Jan 26 13:17:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:37.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.440 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.441 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.441 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] No VIF found with MAC fa:16:3e:f9:13:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.442 227317 INFO nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Using config drive#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.482 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.788 227317 INFO nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Creating config drive at /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/disk.config#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.793 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa9a_hqut execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:37.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.947 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa9a_hqut" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.983 227317 DEBUG nova.storage.rbd_utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] rbd image 11feb864-0940-4546-8dee-7b4f295c60fa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:17:37 np0005596062 nova_compute[227313]: 2026-01-26 18:17:37.988 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/disk.config 11feb864-0940-4546-8dee-7b4f295c60fa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:17:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:39.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:39.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:40 np0005596062 nova_compute[227313]: 2026-01-26 18:17:40.405 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:17:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:17:40 np0005596062 nova_compute[227313]: 2026-01-26 18:17:40.812 227317 DEBUG oslo_concurrency.processutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/disk.config 11feb864-0940-4546-8dee-7b4f295c60fa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.824s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:40 np0005596062 nova_compute[227313]: 2026-01-26 18:17:40.813 227317 INFO nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Deleting local config drive /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa/disk.config because it was imported into RBD.#033[00m
Jan 26 13:17:40 np0005596062 kernel: tapeeccbbe5-aa: entered promiscuous mode
Jan 26 13:17:40 np0005596062 NetworkManager[48993]: <info>  [1769451460.8852] manager: (tapeeccbbe5-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 26 13:17:40 np0005596062 nova_compute[227313]: 2026-01-26 18:17:40.886 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:40Z|00124|binding|INFO|Claiming lport eeccbbe5-aa0e-40af-a243-188f272a475e for this chassis.
Jan 26 13:17:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:40Z|00125|binding|INFO|eeccbbe5-aa0e-40af-a243-188f272a475e: Claiming fa:16:3e:f9:13:eb 10.100.0.3
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.897 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:13:eb 10.100.0.3'], port_security=['fa:16:3e:f9:13:eb 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '11feb864-0940-4546-8dee-7b4f295c60fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-529a7708-48ad-42c7-812a-c022180b51d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bcdb364505d445798cd4757dac03bd74', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f73619d2-4dd8-41cf-b7b3-ef22067e1011', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61dcea34-d0d3-49b0-8032-03e5eff3d2d4, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=eeccbbe5-aa0e-40af-a243-188f272a475e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.898 143929 INFO neutron.agent.ovn.metadata.agent [-] Port eeccbbe5-aa0e-40af-a243-188f272a475e in datapath 529a7708-48ad-42c7-812a-c022180b51d2 bound to our chassis#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.900 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 529a7708-48ad-42c7-812a-c022180b51d2#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.913 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[72e076c4-367a-462c-a32d-ac252d18b260]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.914 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap529a7708-41 in ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.917 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap529a7708-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.917 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[05414db4-5472-4e43-88f4-40ee75c6f642]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.919 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[56d31a7a-fa13-412e-9d4e-aef268dfa3d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:40 np0005596062 systemd-machined[195380]: New machine qemu-13-instance-00000010.
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.934 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[8b611ce0-7322-41ad-a221-9b1c32c5a53f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:40 np0005596062 nova_compute[227313]: 2026-01-26 18:17:40.940 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:40 np0005596062 systemd[1]: Started Virtual Machine qemu-13-instance-00000010.
Jan 26 13:17:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:40Z|00126|binding|INFO|Setting lport eeccbbe5-aa0e-40af-a243-188f272a475e ovn-installed in OVS
Jan 26 13:17:40 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:40Z|00127|binding|INFO|Setting lport eeccbbe5-aa0e-40af-a243-188f272a475e up in Southbound
Jan 26 13:17:40 np0005596062 nova_compute[227313]: 2026-01-26 18:17:40.946 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:40 np0005596062 systemd-udevd[244182]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.961 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a647da-67de-430e-a472-bebb5aafcffe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:40 np0005596062 NetworkManager[48993]: <info>  [1769451460.9682] device (tapeeccbbe5-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:17:40 np0005596062 NetworkManager[48993]: <info>  [1769451460.9688] device (tapeeccbbe5-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:17:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:40.996 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[385be0a6-dccc-43a1-8a87-3331285205fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 NetworkManager[48993]: <info>  [1769451461.0024] manager: (tap529a7708-40): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.001 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[16e69209-b895-4a46-81b3-ccf1ddda3dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 systemd-udevd[244185]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.038 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[9b363a8b-5d3d-43ea-a859-2ada7742a4dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.043 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[6c968d07-c111-456c-bd9e-63d90515202e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 NetworkManager[48993]: <info>  [1769451461.0725] device (tap529a7708-40): carrier: link connected
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.082 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[59b2b6f2-410f-4883-8f58-92047430fd74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.100 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[968eead6-4e6d-4f26-9867-dc94abe2fb4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap529a7708-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:cd:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522112, 'reachable_time': 41094, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244213, 'error': None, 'target': 'ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.119 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6d755a10-c3b9-4c31-93a3-748cba2bb5cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:cd32'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522112, 'tstamp': 522112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244214, 'error': None, 'target': 'ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.140 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[891e87ed-175e-4ec3-89c4-167fda567b03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap529a7708-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:cd:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522112, 'reachable_time': 41094, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244215, 'error': None, 'target': 'ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.178 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[80da1762-7a01-4f08-a0e6-2ed9e667b3b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.254 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ca36e1ae-48d7-42eb-8308-7627a9a26f00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.256 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap529a7708-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.256 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.257 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap529a7708-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.259 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 NetworkManager[48993]: <info>  [1769451461.2604] manager: (tap529a7708-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 26 13:17:41 np0005596062 kernel: tap529a7708-40: entered promiscuous mode
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.262 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.265 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap529a7708-40, col_values=(('external_ids', {'iface-id': 'e4695359-fa25-45e3-9b1f-18b164adcb0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.267 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:41Z|00128|binding|INFO|Releasing lport e4695359-fa25-45e3-9b1f-18b164adcb0a from this chassis (sb_readonly=0)
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.296 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.298 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/529a7708-48ad-42c7-812a-c022180b51d2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/529a7708-48ad-42c7-812a-c022180b51d2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.300 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a830856f-353e-40b7-841e-e8be6418bb92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.301 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-529a7708-48ad-42c7-812a-c022180b51d2
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/529a7708-48ad-42c7-812a-c022180b51d2.pid.haproxy
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 529a7708-48ad-42c7-812a-c022180b51d2
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.302 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2', 'env', 'PROCESS_TAG=haproxy-529a7708-48ad-42c7-812a-c022180b51d2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/529a7708-48ad-42c7-812a-c022180b51d2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.379 227317 DEBUG nova.compute.manager [req-55f9188a-94af-4a27-8cbe-6041e456d2b0 req-6829f8ad-6780-46c3-993f-ec7d81f1ddfb 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.380 227317 DEBUG oslo_concurrency.lockutils [req-55f9188a-94af-4a27-8cbe-6041e456d2b0 req-6829f8ad-6780-46c3-993f-ec7d81f1ddfb 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.380 227317 DEBUG oslo_concurrency.lockutils [req-55f9188a-94af-4a27-8cbe-6041e456d2b0 req-6829f8ad-6780-46c3-993f-ec7d81f1ddfb 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.380 227317 DEBUG oslo_concurrency.lockutils [req-55f9188a-94af-4a27-8cbe-6041e456d2b0 req-6829f8ad-6780-46c3-993f-ec7d81f1ddfb 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.381 227317 DEBUG nova.compute.manager [req-55f9188a-94af-4a27-8cbe-6041e456d2b0 req-6829f8ad-6780-46c3-993f-ec7d81f1ddfb 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Processing event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:17:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:41.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.431 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451461.4311595, 11feb864-0940-4546-8dee-7b4f295c60fa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.432 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] VM Started (Lifecycle Event)#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.435 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.439 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.443 227317 INFO nova.virt.libvirt.driver [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance spawned successfully.#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.443 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.452 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.458 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.464 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.465 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.466 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.466 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.467 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.467 227317 DEBUG nova.virt.libvirt.driver [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.493 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.494 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451461.4322217, 11feb864-0940-4546-8dee-7b4f295c60fa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.495 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.518 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.522 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769451461.439014, 11feb864-0940-4546-8dee-7b4f295c60fa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.523 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.531 227317 INFO nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Took 11.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.531 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.539 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.542 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.569 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.578 227317 DEBUG nova.compute.utils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Conflict updating instance 11feb864-0940-4546-8dee-7b4f295c60fa. Expected: {'task_state': ['spawning']}. Actual: {'task_state': 'deleting'} notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.579 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance disappeared during build. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2483#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.580 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.580 227317 DEBUG nova.virt.libvirt.vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:17:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1254891366',display_name='tempest-ListServerFiltersTestJSON-instance-1254891366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1254891366',id=16,image_ref='be7b1750-5d13-441e-bf97-67d885906c42',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=2026-01-26T18:17:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='bcdb364505d445798cd4757dac03bd74',ramdisk_id='',reservation_id='r-czv06y4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='be7b1750-5d13-441e-bf97-67d885906c42',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_h
w_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-124247066',owner_user_name='tempest-ListServerFiltersTestJSON-124247066-project-member'},tags=TagList,task_state=None,terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:17:31Z,user_data=None,user_id='01ca8f689d9d42eeb8784ceda61c0f3a',uuid=11feb864-0940-4546-8dee-7b4f295c60fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.581 227317 DEBUG nova.network.os_vif_util [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Converting VIF {"id": "eeccbbe5-aa0e-40af-a243-188f272a475e", "address": "fa:16:3e:f9:13:eb", "network": {"id": "529a7708-48ad-42c7-812a-c022180b51d2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-634818565-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bcdb364505d445798cd4757dac03bd74", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeeccbbe5-aa", "ovs_interfaceid": "eeccbbe5-aa0e-40af-a243-188f272a475e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.582 227317 DEBUG nova.network.os_vif_util [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.582 227317 DEBUG os_vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.586 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.587 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeeccbbe5-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.589 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:41Z|00129|binding|INFO|Releasing lport eeccbbe5-aa0e-40af-a243-188f272a475e from this chassis (sb_readonly=0)
Jan 26 13:17:41 np0005596062 ovn_controller[133984]: 2026-01-26T18:17:41Z|00130|binding|INFO|Setting lport eeccbbe5-aa0e-40af-a243-188f272a475e down in Southbound
Jan 26 13:17:41 np0005596062 NetworkManager[48993]: <info>  [1769451461.5911] device (tapeeccbbe5-aa): state change: disconnected -> unmanaged (reason 'unmanaged-external-down', managed-type: 'external')
Jan 26 13:17:41 np0005596062 kernel: tapeeccbbe5-aa: left promiscuous mode
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.591 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.595 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:13:eb 10.100.0.3'], port_security=['fa:16:3e:f9:13:eb 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '11feb864-0940-4546-8dee-7b4f295c60fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-529a7708-48ad-42c7-812a-c022180b51d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bcdb364505d445798cd4757dac03bd74', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f73619d2-4dd8-41cf-b7b3-ef22067e1011', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61dcea34-d0d3-49b0-8032-03e5eff3d2d4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=eeccbbe5-aa0e-40af-a243-188f272a475e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.602 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.605 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.621 227317 INFO os_vif [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:13:eb,bridge_name='br-int',has_traffic_filtering=True,id=eeccbbe5-aa0e-40af-a243-188f272a475e,network=Network(529a7708-48ad-42c7-812a-c022180b51d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeeccbbe5-aa')#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.622 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.622 227317 DEBUG nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:17:41 np0005596062 nova_compute[227313]: 2026-01-26 18:17:41.623 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:17:41 np0005596062 podman[244296]: 2026-01-26 18:17:41.727394356 +0000 UTC m=+0.052766801 container create f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 13:17:41 np0005596062 podman[244281]: 2026-01-26 18:17:41.744760537 +0000 UTC m=+0.095355161 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:17:41 np0005596062 systemd[1]: Started libpod-conmon-f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916.scope.
Jan 26 13:17:41 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:17:41 np0005596062 podman[244296]: 2026-01-26 18:17:41.699070004 +0000 UTC m=+0.024442469 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:17:41 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7878efec084517b641b45cb8e879e25a48a51c3e92ed607a0d3ee29d3290aecb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:17:41 np0005596062 podman[244296]: 2026-01-26 18:17:41.817276881 +0000 UTC m=+0.142649326 container init f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 26 13:17:41 np0005596062 podman[244296]: 2026-01-26 18:17:41.83684681 +0000 UTC m=+0.162219255 container start f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:17:41 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [NOTICE]   (244327) : New worker (244329) forked
Jan 26 13:17:41 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [NOTICE]   (244327) : Loading success.
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.914 143929 INFO neutron.agent.ovn.metadata.agent [-] Port eeccbbe5-aa0e-40af-a243-188f272a475e in datapath 529a7708-48ad-42c7-812a-c022180b51d2 unbound from our chassis#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.916 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 529a7708-48ad-42c7-812a-c022180b51d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.917 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[344508fd-2fa3-491c-83cd-714682290a0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:41.918 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2 namespace which is not needed anymore#033[00m
Jan 26 13:17:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:41.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:42 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [NOTICE]   (244327) : haproxy version is 2.8.14-c23fe91
Jan 26 13:17:42 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [NOTICE]   (244327) : path to executable is /usr/sbin/haproxy
Jan 26 13:17:42 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [WARNING]  (244327) : Exiting Master process...
Jan 26 13:17:42 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [ALERT]    (244327) : Current worker (244329) exited with code 143 (Terminated)
Jan 26 13:17:42 np0005596062 neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2[244323]: [WARNING]  (244327) : All workers exited. Exiting... (0)
Jan 26 13:17:42 np0005596062 systemd[1]: libpod-f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916.scope: Deactivated successfully.
Jan 26 13:17:42 np0005596062 podman[244355]: 2026-01-26 18:17:42.086831033 +0000 UTC m=+0.062468389 container died f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:17:42 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916-userdata-shm.mount: Deactivated successfully.
Jan 26 13:17:42 np0005596062 systemd[1]: var-lib-containers-storage-overlay-7878efec084517b641b45cb8e879e25a48a51c3e92ed607a0d3ee29d3290aecb-merged.mount: Deactivated successfully.
Jan 26 13:17:42 np0005596062 podman[244355]: 2026-01-26 18:17:42.132526335 +0000 UTC m=+0.108163691 container cleanup f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 13:17:42 np0005596062 systemd[1]: libpod-conmon-f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916.scope: Deactivated successfully.
Jan 26 13:17:42 np0005596062 podman[244387]: 2026-01-26 18:17:42.222849242 +0000 UTC m=+0.061550384 container remove f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.231 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f6807e3a-cfef-45e9-9916-0ec5bb4d3be8]: (4, ('Mon Jan 26 06:17:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2 (f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916)\nf21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916\nMon Jan 26 06:17:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2 (f21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916)\nf21959d3a7baedee5a2838c8f0cbcf68a3c466dcd474413771ef748547c24916\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.234 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d28c5f-0d83-48d1-91e7-b7a01550556a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.236 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap529a7708-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:42 np0005596062 kernel: tap529a7708-40: left promiscuous mode
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.242 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.248 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[83a072bc-c630-4593-9ec6-720750c5099a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.257 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.274 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[888449b7-46a9-461a-8f76-25b3aaf5be18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.276 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[207abe66-3313-469f-8e47-9f54bc973e6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.299 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed9f655-82dd-4fff-a1db-8163c1063440]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522104, 'reachable_time': 44476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244402, 'error': None, 'target': 'ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 systemd[1]: run-netns-ovnmeta\x2d529a7708\x2d48ad\x2d42c7\x2d812a\x2dc022180b51d2.mount: Deactivated successfully.
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.307 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-529a7708-48ad-42c7-812a-c022180b51d2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:17:42 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:42.307 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[73f2edb0-ae30-49ae-8862-a0eb36a34604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.616 227317 DEBUG nova.network.neutron [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.637 227317 INFO nova.compute.manager [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Took 1.01 seconds to deallocate network for instance.#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.666 227317 DEBUG nova.compute.manager [req-7d0f87c3-21d0-4e24-8beb-41e662a85a9f req-d7fb6b1b-de64-4b9b-8276-e0963c3b7bb8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received event network-vif-deleted-eeccbbe5-aa0e-40af-a243-188f272a475e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.755 227317 INFO nova.scheduler.client.report [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Deleted allocations for instance 11feb864-0940-4546-8dee-7b4f295c60fa#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.757 227317 DEBUG oslo_concurrency.lockutils [None req-9b3fd859-d6c4-4417-8d8c-1b1f0607dd7d 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.758 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 11.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.759 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.760 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.761 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.763 227317 INFO nova.compute.manager [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Terminating instance#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.765 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.766 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquired lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:17:42 np0005596062 nova_compute[227313]: 2026-01-26 18:17:42.767 227317 DEBUG nova.network.neutron [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:17:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.101 227317 DEBUG nova.network.neutron [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:17:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 26 13:17:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:43.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.500 227317 DEBUG nova.compute.manager [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.501 227317 DEBUG oslo_concurrency.lockutils [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.501 227317 DEBUG oslo_concurrency.lockutils [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.501 227317 DEBUG oslo_concurrency.lockutils [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.502 227317 DEBUG nova.compute.manager [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] No waiting events found dispatching network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.502 227317 WARNING nova.compute.manager [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received unexpected event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e for instance with vm_state active and task_state None.#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.502 227317 DEBUG nova.compute.manager [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.503 227317 DEBUG oslo_concurrency.lockutils [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.503 227317 DEBUG oslo_concurrency.lockutils [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.504 227317 DEBUG oslo_concurrency.lockutils [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.504 227317 DEBUG nova.compute.manager [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] No waiting events found dispatching network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.504 227317 WARNING nova.compute.manager [req-0bf9fb3a-94ee-40d1-bdb1-b164d600b1b9 req-802a7d3a-3e5a-48d9-b3c5-cc07f0324872 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Received unexpected event network-vif-plugged-eeccbbe5-aa0e-40af-a243-188f272a475e for instance with vm_state active and task_state None.#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.524 227317 DEBUG nova.network.neutron [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.550 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Releasing lock "refresh_cache-11feb864-0940-4546-8dee-7b4f295c60fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.551 227317 DEBUG nova.compute.manager [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:17:43 np0005596062 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 26 13:17:43 np0005596062 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Consumed 2.604s CPU time.
Jan 26 13:17:43 np0005596062 systemd-machined[195380]: Machine qemu-13-instance-00000010 terminated.
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.802 227317 INFO nova.virt.libvirt.driver [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance destroyed successfully.#033[00m
Jan 26 13:17:43 np0005596062 nova_compute[227313]: 2026-01-26 18:17:43.803 227317 DEBUG nova.objects.instance [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lazy-loading 'resources' on Instance uuid 11feb864-0940-4546-8dee-7b4f295c60fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:17:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:43.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:45.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.407 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.413 227317 INFO nova.virt.libvirt.driver [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Deleting instance files /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa_del#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.413 227317 INFO nova.virt.libvirt.driver [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Deletion of /var/lib/nova/instances/11feb864-0940-4546-8dee-7b4f295c60fa_del complete#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.481 227317 INFO nova.compute.manager [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Took 1.93 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.482 227317 DEBUG oslo.service.loopingcall [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.483 227317 DEBUG nova.compute.manager [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.483 227317 DEBUG nova.network.neutron [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.656 227317 DEBUG nova.network.neutron [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.671 227317 DEBUG nova.network.neutron [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.699 227317 INFO nova.compute.manager [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Took 0.22 seconds to deallocate network for instance.#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.744 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.745 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:17:45 np0005596062 nova_compute[227313]: 2026-01-26 18:17:45.777 227317 DEBUG oslo_concurrency.processutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:17:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:45.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:17:46 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1161344841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:17:46 np0005596062 nova_compute[227313]: 2026-01-26 18:17:46.233 227317 DEBUG oslo_concurrency.processutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:17:46 np0005596062 nova_compute[227313]: 2026-01-26 18:17:46.242 227317 DEBUG nova.compute.provider_tree [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:17:46 np0005596062 nova_compute[227313]: 2026-01-26 18:17:46.275 227317 DEBUG nova.scheduler.client.report [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:17:46 np0005596062 nova_compute[227313]: 2026-01-26 18:17:46.304 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:46 np0005596062 nova_compute[227313]: 2026-01-26 18:17:46.395 227317 DEBUG oslo_concurrency.lockutils [None req-cab19c3f-4300-4d09-adfd-6d3997cda4e7 01ca8f689d9d42eeb8784ceda61c0f3a bcdb364505d445798cd4757dac03bd74 - - default default] Lock "11feb864-0940-4546-8dee-7b4f295c60fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:17:46 np0005596062 nova_compute[227313]: 2026-01-26 18:17:46.589 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:17:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:17:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:47.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:47.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:48 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:48.273 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:17:48 np0005596062 nova_compute[227313]: 2026-01-26 18:17:48.274 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:48 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:48.275 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:17:48 np0005596062 nova_compute[227313]: 2026-01-26 18:17:48.877 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:49.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:49.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:50 np0005596062 nova_compute[227313]: 2026-01-26 18:17:50.408 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:51.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:51 np0005596062 nova_compute[227313]: 2026-01-26 18:17:51.592 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:51 np0005596062 podman[244508]: 2026-01-26 18:17:51.926523876 +0000 UTC m=+0.131126270 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 13:17:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:51.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:52 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:17:52.277 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:17:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:53.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:53.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:55.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:55 np0005596062 nova_compute[227313]: 2026-01-26 18:17:55.449 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:55.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:56 np0005596062 nova_compute[227313]: 2026-01-26 18:17:56.624 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:17:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:17:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:57.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:17:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:17:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:57.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:17:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:17:58 np0005596062 nova_compute[227313]: 2026-01-26 18:17:58.800 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769451463.7995229, 11feb864-0940-4546-8dee-7b4f295c60fa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:17:58 np0005596062 nova_compute[227313]: 2026-01-26 18:17:58.801 227317 INFO nova.compute.manager [-] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:17:58 np0005596062 nova_compute[227313]: 2026-01-26 18:17:58.844 227317 DEBUG nova.compute.manager [None req-ffa296ed-5516-4312-bb99-cb85e372d8db - - - - - -] [instance: 11feb864-0940-4546-8dee-7b4f295c60fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:17:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:17:59.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:17:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:17:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:17:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:17:59.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:00 np0005596062 nova_compute[227313]: 2026-01-26 18:18:00.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:00 np0005596062 nova_compute[227313]: 2026-01-26 18:18:00.452 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.286 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.286 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.287 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.287 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.287 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:18:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:01.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.671 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:18:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/288936531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.756 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:18:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:18:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:01.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:18:01 np0005596062 nova_compute[227313]: 2026-01-26 18:18:01.998 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.001 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4835MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.001 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.001 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.122 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.123 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.144 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:18:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:18:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2969904656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.616 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.623 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.737 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.773 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:18:02 np0005596062 nova_compute[227313]: 2026-01-26 18:18:02.775 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:18:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:03.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:03 np0005596062 nova_compute[227313]: 2026-01-26 18:18:03.771 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:03 np0005596062 nova_compute[227313]: 2026-01-26 18:18:03.772 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:03.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:04 np0005596062 nova_compute[227313]: 2026-01-26 18:18:04.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:05 np0005596062 nova_compute[227313]: 2026-01-26 18:18:05.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:05 np0005596062 nova_compute[227313]: 2026-01-26 18:18:05.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:18:05 np0005596062 nova_compute[227313]: 2026-01-26 18:18:05.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:18:05 np0005596062 nova_compute[227313]: 2026-01-26 18:18:05.074 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:18:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:05.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:05 np0005596062 nova_compute[227313]: 2026-01-26 18:18:05.454 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:18:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:05.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:18:06 np0005596062 nova_compute[227313]: 2026-01-26 18:18:06.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:06 np0005596062 nova_compute[227313]: 2026-01-26 18:18:06.079 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:06 np0005596062 nova_compute[227313]: 2026-01-26 18:18:06.674 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:07.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:07.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:09 np0005596062 nova_compute[227313]: 2026-01-26 18:18:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:18:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:18:09.168 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:18:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:18:09.169 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:18:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:18:09.169 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:18:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:09.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:09.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:10 np0005596062 nova_compute[227313]: 2026-01-26 18:18:10.456 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:11.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:11 np0005596062 nova_compute[227313]: 2026-01-26 18:18:11.678 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:11 np0005596062 podman[244642]: 2026-01-26 18:18:11.851662015 +0000 UTC m=+0.056068479 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:18:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:11.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:13.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:13.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:15.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:15 np0005596062 nova_compute[227313]: 2026-01-26 18:18:15.503 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:15.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:16 np0005596062 nova_compute[227313]: 2026-01-26 18:18:16.734 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:17.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:17.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:19.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:19.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:20 np0005596062 nova_compute[227313]: 2026-01-26 18:18:20.506 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:21.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:21 np0005596062 nova_compute[227313]: 2026-01-26 18:18:21.739 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:21.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:22 np0005596062 podman[244718]: 2026-01-26 18:18:22.892669768 +0000 UTC m=+0.103601540 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:18:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:23.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:23.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:25.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:25 np0005596062 nova_compute[227313]: 2026-01-26 18:18:25.508 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:25.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:26 np0005596062 nova_compute[227313]: 2026-01-26 18:18:26.795 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:27.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:27.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:29.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:29.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:30 np0005596062 nova_compute[227313]: 2026-01-26 18:18:30.511 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:31.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.491064) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451511491122, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2209, "num_deletes": 255, "total_data_size": 5220042, "memory_usage": 5288064, "flush_reason": "Manual Compaction"}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451511513488, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3419225, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28873, "largest_seqno": 31077, "table_properties": {"data_size": 3410174, "index_size": 5673, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18812, "raw_average_key_size": 20, "raw_value_size": 3392033, "raw_average_value_size": 3707, "num_data_blocks": 248, "num_entries": 915, "num_filter_entries": 915, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451318, "oldest_key_time": 1769451318, "file_creation_time": 1769451511, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 22517 microseconds, and 8731 cpu microseconds.
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.513571) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3419225 bytes OK
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.513611) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.515879) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.515904) EVENT_LOG_v1 {"time_micros": 1769451511515896, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.515937) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5210300, prev total WAL file size 5210300, number of live WAL files 2.
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.518375) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3339KB)], [57(7853KB)]
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451511518475, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 11460959, "oldest_snapshot_seqno": -1}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5415 keys, 9458517 bytes, temperature: kUnknown
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451511583590, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 9458517, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9421497, "index_size": 22382, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 138279, "raw_average_key_size": 25, "raw_value_size": 9322894, "raw_average_value_size": 1721, "num_data_blocks": 905, "num_entries": 5415, "num_filter_entries": 5415, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451511, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.584190) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 9458517 bytes
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.586101) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.3 rd, 144.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 7.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(6.1) write-amplify(2.8) OK, records in: 5940, records dropped: 525 output_compression: NoCompression
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.586128) EVENT_LOG_v1 {"time_micros": 1769451511586116, "job": 34, "event": "compaction_finished", "compaction_time_micros": 65379, "compaction_time_cpu_micros": 30561, "output_level": 6, "num_output_files": 1, "total_output_size": 9458517, "num_input_records": 5940, "num_output_records": 5415, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451511587005, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451511589197, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.518120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.589281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.589287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.589290) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.589292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:18:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:18:31.589294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:18:31 np0005596062 nova_compute[227313]: 2026-01-26 18:18:31.798 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:31.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:33.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:33.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:18:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:35.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:18:35 np0005596062 nova_compute[227313]: 2026-01-26 18:18:35.513 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:35.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:36 np0005596062 nova_compute[227313]: 2026-01-26 18:18:36.800 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:37.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:37 np0005596062 ovn_controller[133984]: 2026-01-26T18:18:37Z|00131|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 13:18:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:37.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:39.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:18:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:40.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:18:40 np0005596062 nova_compute[227313]: 2026-01-26 18:18:40.516 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:41.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:41 np0005596062 nova_compute[227313]: 2026-01-26 18:18:41.804 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:42.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:42 np0005596062 podman[244806]: 2026-01-26 18:18:42.841660092 +0000 UTC m=+0.055070513 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 26 13:18:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:43.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:18:43.595 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:18:43 np0005596062 nova_compute[227313]: 2026-01-26 18:18:43.595 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:18:43.598 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:18:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:44.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:45.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:45 np0005596062 nova_compute[227313]: 2026-01-26 18:18:45.518 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:18:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:46.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:18:46 np0005596062 nova_compute[227313]: 2026-01-26 18:18:46.843 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:48.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:48.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:18:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:18:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:18:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:18:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:50.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:18:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:50.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:18:50 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:18:50 np0005596062 nova_compute[227313]: 2026-01-26 18:18:50.520 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:18:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:18:51 np0005596062 nova_compute[227313]: 2026-01-26 18:18:51.897 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:52.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:52.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:18:53.600 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:18:53 np0005596062 podman[245014]: 2026-01-26 18:18:53.954652092 +0000 UTC m=+0.145680824 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:18:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:54.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:54.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:55 np0005596062 nova_compute[227313]: 2026-01-26 18:18:55.525 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:56.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:18:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:56.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:18:56 np0005596062 nova_compute[227313]: 2026-01-26 18:18:56.900 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:18:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:18:58.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:18:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:18:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:18:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:18:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:18:58.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:00.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:00.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:00 np0005596062 nova_compute[227313]: 2026-01-26 18:19:00.527 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:19:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.080 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.081 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.081 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:19:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:19:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/771405532' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.515 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:19:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.736 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.738 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4832MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.738 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.739 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.827 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.828 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.846 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:19:01 np0005596062 nova_compute[227313]: 2026-01-26 18:19:01.943 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:02.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:19:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2269540053' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:19:02 np0005596062 nova_compute[227313]: 2026-01-26 18:19:02.359 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:19:02 np0005596062 nova_compute[227313]: 2026-01-26 18:19:02.367 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:19:02 np0005596062 nova_compute[227313]: 2026-01-26 18:19:02.419 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:19:02 np0005596062 nova_compute[227313]: 2026-01-26 18:19:02.424 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 13:19:02 np0005596062 nova_compute[227313]: 2026-01-26 18:19:02.425 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:19:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:03 np0005596062 nova_compute[227313]: 2026-01-26 18:19:03.426 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:03 np0005596062 nova_compute[227313]: 2026-01-26 18:19:03.427 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:03 np0005596062 nova_compute[227313]: 2026-01-26 18:19:03.427 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:19:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:04.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:04 np0005596062 nova_compute[227313]: 2026-01-26 18:19:04.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:04.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:05 np0005596062 nova_compute[227313]: 2026-01-26 18:19:05.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:05 np0005596062 nova_compute[227313]: 2026-01-26 18:19:05.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 13:19:05 np0005596062 nova_compute[227313]: 2026-01-26 18:19:05.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 13:19:05 np0005596062 nova_compute[227313]: 2026-01-26 18:19:05.529 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 26 13:19:05 np0005596062 nova_compute[227313]: 2026-01-26 18:19:05.766 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 13:19:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:06 np0005596062 nova_compute[227313]: 2026-01-26 18:19:06.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:06.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:06 np0005596062 nova_compute[227313]: 2026-01-26 18:19:06.947 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:07 np0005596062 nova_compute[227313]: 2026-01-26 18:19:07.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 26 13:19:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:08.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:08.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:09 np0005596062 nova_compute[227313]: 2026-01-26 18:19:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:19:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:19:09.168 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:19:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:19:09.169 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:19:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:19:09.169 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:19:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:10.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 13:19:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:10.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 13:19:10 np0005596062 nova_compute[227313]: 2026-01-26 18:19:10.533 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:11 np0005596062 nova_compute[227313]: 2026-01-26 18:19:11.949 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:12.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:12.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:13 np0005596062 podman[245195]: 2026-01-26 18:19:13.862802955 +0000 UTC m=+0.067608948 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 13:19:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:14.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:14.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 26 13:19:15 np0005596062 nova_compute[227313]: 2026-01-26 18:19:15.534 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:16.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:16.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:16 np0005596062 nova_compute[227313]: 2026-01-26 18:19:16.960 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 26 13:19:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:18.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:18.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 26 13:19:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:20.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:20.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:20 np0005596062 nova_compute[227313]: 2026-01-26 18:19:20.536 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 26 13:19:21 np0005596062 nova_compute[227313]: 2026-01-26 18:19:21.970 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:22.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:22.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:24.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:24.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:24 np0005596062 nova_compute[227313]: 2026-01-26 18:19:24.431 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:19:24.431 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 13:19:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:19:24.434 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 13:19:24 np0005596062 podman[245218]: 2026-01-26 18:19:24.902229482 +0000 UTC m=+0.103203479 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:19:25 np0005596062 nova_compute[227313]: 2026-01-26 18:19:25.539 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:26.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:26.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:26 np0005596062 nova_compute[227313]: 2026-01-26 18:19:26.972 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:28.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:28.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 26 13:19:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:30.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:30.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:30 np0005596062 nova_compute[227313]: 2026-01-26 18:19:30.547 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:31 np0005596062 nova_compute[227313]: 2026-01-26 18:19:31.976 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:19:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:32.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:32.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:33 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:19:33.438 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:19:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:34.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:34.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:35 np0005596062 nova_compute[227313]: 2026-01-26 18:19:35.551 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:36.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:36.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:36 np0005596062 nova_compute[227313]: 2026-01-26 18:19:36.979 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:38.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:38.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:40.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:40.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:40 np0005596062 nova_compute[227313]: 2026-01-26 18:19:40.553 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:41 np0005596062 nova_compute[227313]: 2026-01-26 18:19:41.982 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:42.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:42.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:44.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:44.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:44 np0005596062 podman[245307]: 2026-01-26 18:19:44.878174715 +0000 UTC m=+0.087823638 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:19:45 np0005596062 nova_compute[227313]: 2026-01-26 18:19:45.555 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:46.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:46.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:46 np0005596062 nova_compute[227313]: 2026-01-26 18:19:46.985 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:19:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:19:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:48.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:50.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:50.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:50 np0005596062 nova_compute[227313]: 2026-01-26 18:19:50.559 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:51 np0005596062 nova_compute[227313]: 2026-01-26 18:19:51.987 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:52.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:52.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:54.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:54.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:55 np0005596062 nova_compute[227313]: 2026-01-26 18:19:55.561 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:55 np0005596062 podman[245383]: 2026-01-26 18:19:55.887115708 +0000 UTC m=+0.094859126 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Jan 26 13:19:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:56.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:56.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:19:56 np0005596062 nova_compute[227313]: 2026-01-26 18:19:56.990 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:19:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:19:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:19:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:19:58.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:19:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:19:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:19:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:19:58.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:00.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:00.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:00 np0005596062 nova_compute[227313]: 2026-01-26 18:20:00.564 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:00 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.095 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.096 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.096 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:20:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:20:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/911724708' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.561 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.750 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.751 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4827MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.751 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.751 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:20:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:20:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:20:01 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.995 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:20:01 np0005596062 nova_compute[227313]: 2026-01-26 18:20:01.996 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:20:02 np0005596062 nova_compute[227313]: 2026-01-26 18:20:02.000 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:02 np0005596062 nova_compute[227313]: 2026-01-26 18:20:02.066 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:20:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:02.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:02.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:20:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/999105623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:20:02 np0005596062 nova_compute[227313]: 2026-01-26 18:20:02.523 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:20:02 np0005596062 nova_compute[227313]: 2026-01-26 18:20:02.531 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:20:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:03 np0005596062 nova_compute[227313]: 2026-01-26 18:20:03.088 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:20:03 np0005596062 nova_compute[227313]: 2026-01-26 18:20:03.089 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 13:20:03 np0005596062 nova_compute[227313]: 2026-01-26 18:20:03.090 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:20:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:04.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:04.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:05 np0005596062 nova_compute[227313]: 2026-01-26 18:20:05.566 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.085 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.086 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:06.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.132 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.132 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.132 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.154 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.155 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.156 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:06 np0005596062 nova_compute[227313]: 2026-01-26 18:20:06.156 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:20:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:06.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:07 np0005596062 nova_compute[227313]: 2026-01-26 18:20:07.046 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:07 np0005596062 nova_compute[227313]: 2026-01-26 18:20:07.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:08 np0005596062 nova_compute[227313]: 2026-01-26 18:20:08.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 13:20:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:08.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 13:20:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:08.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:09 np0005596062 nova_compute[227313]: 2026-01-26 18:20:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:20:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:20:09.170 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:20:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:20:09.170 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:20:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:20:09.171 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:20:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:10.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:10.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:10 np0005596062 nova_compute[227313]: 2026-01-26 18:20:10.568 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:20:10 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:20:12 np0005596062 nova_compute[227313]: 2026-01-26 18:20:12.049 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:12.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:12.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:14.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:14.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:15 np0005596062 nova_compute[227313]: 2026-01-26 18:20:15.570 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:15 np0005596062 podman[245694]: 2026-01-26 18:20:15.867480889 +0000 UTC m=+0.067494125 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:20:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:16.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:16.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:17 np0005596062 nova_compute[227313]: 2026-01-26 18:20:17.052 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:18.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:18.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:20.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:20.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:20 np0005596062 nova_compute[227313]: 2026-01-26 18:20:20.572 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:22 np0005596062 nova_compute[227313]: 2026-01-26 18:20:22.105 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:22.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:22.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:24.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:24.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:25 np0005596062 nova_compute[227313]: 2026-01-26 18:20:25.574 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:26.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:26.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:26 np0005596062 podman[245719]: 2026-01-26 18:20:26.974047531 +0000 UTC m=+0.177079644 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:20:27 np0005596062 nova_compute[227313]: 2026-01-26 18:20:27.107 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:28.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:28.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:20:29.707 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 13:20:29 np0005596062 nova_compute[227313]: 2026-01-26 18:20:29.708 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:20:29.709 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 13:20:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:30.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:30.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:30 np0005596062 nova_compute[227313]: 2026-01-26 18:20:30.576 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:32 np0005596062 nova_compute[227313]: 2026-01-26 18:20:32.111 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:32.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:32.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:34.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:34.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:35 np0005596062 nova_compute[227313]: 2026-01-26 18:20:35.578 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:35 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:20:35.712 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 13:20:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:36.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:36.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:37 np0005596062 nova_compute[227313]: 2026-01-26 18:20:37.114 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:38.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:38.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:40.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:40.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:40 np0005596062 nova_compute[227313]: 2026-01-26 18:20:40.580 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:20:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:42.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:42 np0005596062 nova_compute[227313]: 2026-01-26 18:20:42.168 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:42.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:44.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:44.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:45 np0005596062 nova_compute[227313]: 2026-01-26 18:20:45.591 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:46.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:46.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:46 np0005596062 podman[245805]: 2026-01-26 18:20:46.848843872 +0000 UTC m=+0.054378314 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 13:20:47 np0005596062 nova_compute[227313]: 2026-01-26 18:20:47.172 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:48.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:48.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:50.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:50.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:50 np0005596062 nova_compute[227313]: 2026-01-26 18:20:50.594 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:52 np0005596062 nova_compute[227313]: 2026-01-26 18:20:52.175 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:52.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:20:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:52.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:20:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:54.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:54.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:55 np0005596062 nova_compute[227313]: 2026-01-26 18:20:55.638 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:56.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:56.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:20:57 np0005596062 nova_compute[227313]: 2026-01-26 18:20:57.179 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:20:57 np0005596062 podman[245881]: 2026-01-26 18:20:57.946860446 +0000 UTC m=+0.145602631 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 13:20:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:20:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:20:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:20:58.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:20:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:20:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:20:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:20:58.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:00.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:00.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:00 np0005596062 nova_compute[227313]: 2026-01-26 18:21:00.739 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.091 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.092 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.092 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.092 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.092 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.182 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:02.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:02.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:21:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3091245719' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.575 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.744 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.746 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4822MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.746 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.746 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.867 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.868 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:21:02 np0005596062 nova_compute[227313]: 2026-01-26 18:21:02.931 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:21:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:21:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3002186492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:21:03 np0005596062 nova_compute[227313]: 2026-01-26 18:21:03.436 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:21:03 np0005596062 nova_compute[227313]: 2026-01-26 18:21:03.443 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:21:03 np0005596062 nova_compute[227313]: 2026-01-26 18:21:03.462 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:21:03 np0005596062 nova_compute[227313]: 2026-01-26 18:21:03.465 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:21:03 np0005596062 nova_compute[227313]: 2026-01-26 18:21:03.465 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:21:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:04.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:04.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:04 np0005596062 nova_compute[227313]: 2026-01-26 18:21:04.466 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:04 np0005596062 nova_compute[227313]: 2026-01-26 18:21:04.467 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:05 np0005596062 nova_compute[227313]: 2026-01-26 18:21:05.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:05 np0005596062 nova_compute[227313]: 2026-01-26 18:21:05.741 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:06 np0005596062 nova_compute[227313]: 2026-01-26 18:21:06.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:06 np0005596062 nova_compute[227313]: 2026-01-26 18:21:06.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:21:06 np0005596062 nova_compute[227313]: 2026-01-26 18:21:06.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:21:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:06.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:06.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:07 np0005596062 nova_compute[227313]: 2026-01-26 18:21:07.187 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:07 np0005596062 nova_compute[227313]: 2026-01-26 18:21:07.298 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:21:07 np0005596062 nova_compute[227313]: 2026-01-26 18:21:07.299 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:07 np0005596062 nova_compute[227313]: 2026-01-26 18:21:07.299 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:21:08 np0005596062 nova_compute[227313]: 2026-01-26 18:21:08.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:08 np0005596062 nova_compute[227313]: 2026-01-26 18:21:08.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:08.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:08.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:09 np0005596062 nova_compute[227313]: 2026-01-26 18:21:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:21:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:21:09.170 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:21:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:21:09.171 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:21:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:21:09.171 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:21:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:10.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:10.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:10 np0005596062 nova_compute[227313]: 2026-01-26 18:21:10.786 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:12 np0005596062 nova_compute[227313]: 2026-01-26 18:21:12.191 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:12.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:12.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:21:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:21:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:21:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:14.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:14.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:15 np0005596062 nova_compute[227313]: 2026-01-26 18:21:15.821 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:16.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:16.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:17 np0005596062 nova_compute[227313]: 2026-01-26 18:21:17.231 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:17 np0005596062 podman[246144]: 2026-01-26 18:21:17.851177396 +0000 UTC m=+0.059212353 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 26 13:21:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:18.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:18.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:21:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:21:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:20.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:20.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:20 np0005596062 nova_compute[227313]: 2026-01-26 18:21:20.822 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:22.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:22 np0005596062 nova_compute[227313]: 2026-01-26 18:21:22.234 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:22.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 13:21:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:24.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 13:21:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:24.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:25 np0005596062 nova_compute[227313]: 2026-01-26 18:21:25.824 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:26.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:26.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:27 np0005596062 nova_compute[227313]: 2026-01-26 18:21:27.237 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:28.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:28.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:28 np0005596062 podman[246220]: 2026-01-26 18:21:28.879230418 +0000 UTC m=+0.087609043 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 13:21:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:21:29.449 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:21:29 np0005596062 nova_compute[227313]: 2026-01-26 18:21:29.450 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:21:29.451 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:21:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:30.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:30.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:30 np0005596062 nova_compute[227313]: 2026-01-26 18:21:30.828 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:32.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:32 np0005596062 nova_compute[227313]: 2026-01-26 18:21:32.240 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:32.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:34.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:34.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:35 np0005596062 nova_compute[227313]: 2026-01-26 18:21:35.830 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:36.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:36.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:21:36.453 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:21:37 np0005596062 nova_compute[227313]: 2026-01-26 18:21:37.243 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:38.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:38.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:40.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:40.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:40 np0005596062 nova_compute[227313]: 2026-01-26 18:21:40.869 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:42.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:42 np0005596062 nova_compute[227313]: 2026-01-26 18:21:42.245 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:42.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:21:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6547 writes, 32K keys, 6547 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6547 writes, 6547 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1539 writes, 7037 keys, 1539 commit groups, 1.0 writes per commit group, ingest: 15.69 MB, 0.03 MB/s#012Interval WAL: 1539 writes, 1539 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     70.6      0.55              0.13        17    0.032       0      0       0.0       0.0#012  L6      1/0    9.02 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.4     79.9     65.3      2.03              0.43        16    0.127     79K   8910       0.0       0.0#012 Sum      1/0    9.02 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.4     62.9     66.4      2.58              0.56        33    0.078     79K   8910       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.0    137.1    136.6      0.22              0.09         6    0.037     17K   1987       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0     79.9     65.3      2.03              0.43        16    0.127     79K   8910       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     70.8      0.54              0.13        16    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.038, interval 0.006#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.17 GB write, 0.07 MB/s write, 0.16 GB read, 0.07 MB/s read, 2.6 seconds#012Interval compaction: 0.03 GB write, 0.05 MB/s write, 0.03 GB read, 0.05 MB/s read, 0.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 17.71 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000285 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1014,17.08 MB,5.6197%) FilterBlock(33,228.61 KB,0.0734379%) IndexBlock(33,416.77 KB,0.133881%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 13:21:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:44.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:44.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:45 np0005596062 nova_compute[227313]: 2026-01-26 18:21:45.871 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:21:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:46.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:21:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:46.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:47 np0005596062 nova_compute[227313]: 2026-01-26 18:21:47.248 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:48.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:48 np0005596062 podman[246306]: 2026-01-26 18:21:48.849822458 +0000 UTC m=+0.056212913 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 26 13:21:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:50.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:50.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:50 np0005596062 nova_compute[227313]: 2026-01-26 18:21:50.923 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:52.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:52 np0005596062 nova_compute[227313]: 2026-01-26 18:21:52.294 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:21:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:52.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:21:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:54.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:54.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:55 np0005596062 nova_compute[227313]: 2026-01-26 18:21:55.927 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:56.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:56.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:57 np0005596062 nova_compute[227313]: 2026-01-26 18:21:57.297 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:21:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:21:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:21:58.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:21:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:21:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:21:58.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:21:59 np0005596062 podman[246380]: 2026-01-26 18:21:59.884603659 +0000 UTC m=+0.089837612 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 13:22:00 np0005596062 nova_compute[227313]: 2026-01-26 18:22:00.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:00 np0005596062 nova_compute[227313]: 2026-01-26 18:22:00.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:22:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:00.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:00.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:00 np0005596062 nova_compute[227313]: 2026-01-26 18:22:00.929 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:02.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.300 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:02.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.419 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.680 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.681 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.681 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.682 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:22:02 np0005596062 nova_compute[227313]: 2026-01-26 18:22:02.682 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:22:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:22:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3853197927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.148 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.326 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.328 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4833MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.328 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.328 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.530 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.531 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:22:03 np0005596062 nova_compute[227313]: 2026-01-26 18:22:03.607 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:22:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:04.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:04.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.021 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.022 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.053 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.094 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.165 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:22:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:22:05 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/938699993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.609 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.618 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:22:05 np0005596062 nova_compute[227313]: 2026-01-26 18:22:05.931 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:06.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:06.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:07 np0005596062 nova_compute[227313]: 2026-01-26 18:22:07.304 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:08.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:08.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.420007) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451728420069, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2407, "num_deletes": 254, "total_data_size": 5912448, "memory_usage": 5979096, "flush_reason": "Manual Compaction"}
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451728446021, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3866524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31082, "largest_seqno": 33484, "table_properties": {"data_size": 3856718, "index_size": 6300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19873, "raw_average_key_size": 20, "raw_value_size": 3837177, "raw_average_value_size": 3951, "num_data_blocks": 275, "num_entries": 971, "num_filter_entries": 971, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451512, "oldest_key_time": 1769451512, "file_creation_time": 1769451728, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 26099 microseconds, and 11853 cpu microseconds.
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.446095) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3866524 bytes OK
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.446129) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.448036) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.448054) EVENT_LOG_v1 {"time_micros": 1769451728448048, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.448082) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5901981, prev total WAL file size 5901981, number of live WAL files 2.
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.450450) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3775KB)], [60(9236KB)]
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451728450509, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 13325041, "oldest_snapshot_seqno": -1}
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5861 keys, 11222007 bytes, temperature: kUnknown
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451728529651, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 11222007, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11180519, "index_size": 25762, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14661, "raw_key_size": 148286, "raw_average_key_size": 25, "raw_value_size": 11072529, "raw_average_value_size": 1889, "num_data_blocks": 1046, "num_entries": 5861, "num_filter_entries": 5861, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451728, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:22:08 np0005596062 nova_compute[227313]: 2026-01-26 18:22:08.528 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:22:08 np0005596062 nova_compute[227313]: 2026-01-26 18:22:08.530 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:22:08 np0005596062 nova_compute[227313]: 2026-01-26 18:22:08.530 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.530054) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 11222007 bytes
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.531622) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 168.0 rd, 141.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 9.0 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(6.3) write-amplify(2.9) OK, records in: 6386, records dropped: 525 output_compression: NoCompression
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.531647) EVENT_LOG_v1 {"time_micros": 1769451728531634, "job": 36, "event": "compaction_finished", "compaction_time_micros": 79302, "compaction_time_cpu_micros": 37337, "output_level": 6, "num_output_files": 1, "total_output_size": 11222007, "num_input_records": 6386, "num_output_records": 5861, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:22:08 np0005596062 nova_compute[227313]: 2026-01-26 18:22:08.531 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451728532928, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 26 13:22:08 np0005596062 nova_compute[227313]: 2026-01-26 18:22:08.532 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451728535418, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.450296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.535565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.535574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.535577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.535579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:22:08 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:22:08.535583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:22:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:22:09.171 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:22:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:22:09.172 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:22:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:22:09.172 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:22:09 np0005596062 nova_compute[227313]: 2026-01-26 18:22:09.308 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:22:09 np0005596062 nova_compute[227313]: 2026-01-26 18:22:09.309 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:09 np0005596062 nova_compute[227313]: 2026-01-26 18:22:09.996 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:09 np0005596062 nova_compute[227313]: 2026-01-26 18:22:09.997 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:09 np0005596062 nova_compute[227313]: 2026-01-26 18:22:09.997 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:22:09 np0005596062 nova_compute[227313]: 2026-01-26 18:22:09.997 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.042 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.042 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.043 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.043 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.043 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.044 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.044 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:10.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:10.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:10 np0005596062 nova_compute[227313]: 2026-01-26 18:22:10.934 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:11 np0005596062 nova_compute[227313]: 2026-01-26 18:22:11.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:12.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:12 np0005596062 nova_compute[227313]: 2026-01-26 18:22:12.306 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:12.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:14.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:14.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:15 np0005596062 nova_compute[227313]: 2026-01-26 18:22:15.936 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:22:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 12K writes, 3811 syncs, 3.26 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2626 writes, 7335 keys, 2626 commit groups, 1.0 writes per commit group, ingest: 5.16 MB, 0.01 MB/s#012Interval WAL: 2626 writes, 1146 syncs, 2.29 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:22:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:16.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:16.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:17 np0005596062 nova_compute[227313]: 2026-01-26 18:22:17.319 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:18.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:18.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:19 np0005596062 podman[246560]: 2026-01-26 18:22:19.010547961 +0000 UTC m=+0.060181790 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:22:19 np0005596062 podman[246700]: 2026-01-26 18:22:19.608589773 +0000 UTC m=+0.076921747 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 13:22:19 np0005596062 podman[246700]: 2026-01-26 18:22:19.712445849 +0000 UTC m=+0.180777833 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 13:22:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:20.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:20.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:20 np0005596062 podman[246857]: 2026-01-26 18:22:20.401968755 +0000 UTC m=+0.059385588 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:22:20 np0005596062 podman[246857]: 2026-01-26 18:22:20.433823637 +0000 UTC m=+0.091240460 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:22:20 np0005596062 podman[246921]: 2026-01-26 18:22:20.656998991 +0000 UTC m=+0.055553546 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, release=1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, distribution-scope=public, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, vendor=Red Hat, Inc.)
Jan 26 13:22:20 np0005596062 podman[246921]: 2026-01-26 18:22:20.669825804 +0000 UTC m=+0.068380339 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.openshift.expose-services=, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, distribution-scope=public, architecture=x86_64, io.buildah.version=1.28.2, name=keepalived)
Jan 26 13:22:20 np0005596062 nova_compute[227313]: 2026-01-26 18:22:20.939 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:22:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:22:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:22:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:22:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:22:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:22.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:22 np0005596062 nova_compute[227313]: 2026-01-26 18:22:22.322 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:22.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:24.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:24.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:25 np0005596062 nova_compute[227313]: 2026-01-26 18:22:25.941 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:26.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:26.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:27 np0005596062 nova_compute[227313]: 2026-01-26 18:22:27.327 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:28.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:28.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:22:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:22:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:30.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:30.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:30 np0005596062 nova_compute[227313]: 2026-01-26 18:22:30.590 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:22:30 np0005596062 podman[247140]: 2026-01-26 18:22:30.90620478 +0000 UTC m=+0.103267525 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 13:22:30 np0005596062 nova_compute[227313]: 2026-01-26 18:22:30.943 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:32.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:32 np0005596062 nova_compute[227313]: 2026-01-26 18:22:32.330 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:32.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:34.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:35 np0005596062 nova_compute[227313]: 2026-01-26 18:22:35.945 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:36.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:36.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:37 np0005596062 nova_compute[227313]: 2026-01-26 18:22:37.351 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:22:37.848 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:22:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:22:37.849 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:22:37 np0005596062 nova_compute[227313]: 2026-01-26 18:22:37.850 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:38.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:38.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:22:39.851 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:22:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:40.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:40.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:40 np0005596062 nova_compute[227313]: 2026-01-26 18:22:40.998 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:42 np0005596062 nova_compute[227313]: 2026-01-26 18:22:42.354 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:42.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:44.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:46 np0005596062 nova_compute[227313]: 2026-01-26 18:22:46.001 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:46.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:46.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:47 np0005596062 nova_compute[227313]: 2026-01-26 18:22:47.357 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:48.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:48.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:49 np0005596062 podman[247227]: 2026-01-26 18:22:49.855873149 +0000 UTC m=+0.062165383 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 26 13:22:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:50.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:51 np0005596062 nova_compute[227313]: 2026-01-26 18:22:51.004 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:52 np0005596062 nova_compute[227313]: 2026-01-26 18:22:52.359 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:22:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:52.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:22:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 26 13:22:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 26 13:22:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:22:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:54.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:22:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:54.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:56 np0005596062 nova_compute[227313]: 2026-01-26 18:22:56.007 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:56.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:56.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:57 np0005596062 nova_compute[227313]: 2026-01-26 18:22:57.363 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:22:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:22:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:22:58.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:22:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:22:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:22:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:22:58.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:00.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:00.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:01 np0005596062 nova_compute[227313]: 2026-01-26 18:23:01.042 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:01 np0005596062 podman[247304]: 2026-01-26 18:23:01.895672086 +0000 UTC m=+0.106297346 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 13:23:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:02.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:02 np0005596062 nova_compute[227313]: 2026-01-26 18:23:02.366 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:02.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.077 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.079 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.079 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.079 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.079 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:23:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:23:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/173953021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.559 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.744 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.746 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4837MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.746 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.747 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.925 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.926 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:23:03 np0005596062 nova_compute[227313]: 2026-01-26 18:23:03.951 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:23:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:04.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:23:04 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2137023878' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:23:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:04.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:04 np0005596062 nova_compute[227313]: 2026-01-26 18:23:04.475 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:23:04 np0005596062 nova_compute[227313]: 2026-01-26 18:23:04.483 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:23:04 np0005596062 nova_compute[227313]: 2026-01-26 18:23:04.544 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:23:04 np0005596062 nova_compute[227313]: 2026-01-26 18:23:04.546 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:23:04 np0005596062 nova_compute[227313]: 2026-01-26 18:23:04.546 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:23:06 np0005596062 nova_compute[227313]: 2026-01-26 18:23:06.044 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:06.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:06.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:06 np0005596062 nova_compute[227313]: 2026-01-26 18:23:06.547 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:07 np0005596062 nova_compute[227313]: 2026-01-26 18:23:07.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:07 np0005596062 nova_compute[227313]: 2026-01-26 18:23:07.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:23:07 np0005596062 nova_compute[227313]: 2026-01-26 18:23:07.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:23:07 np0005596062 nova_compute[227313]: 2026-01-26 18:23:07.369 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:07 np0005596062 nova_compute[227313]: 2026-01-26 18:23:07.691 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:23:08 np0005596062 nova_compute[227313]: 2026-01-26 18:23:08.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:08 np0005596062 nova_compute[227313]: 2026-01-26 18:23:08.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:23:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:08.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:23:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:08.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:09 np0005596062 nova_compute[227313]: 2026-01-26 18:23:09.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:09 np0005596062 nova_compute[227313]: 2026-01-26 18:23:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:09 np0005596062 nova_compute[227313]: 2026-01-26 18:23:09.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:23:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:23:09.172 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:23:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:23:09.172 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:23:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:23:09.172 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.483883) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451789484030, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1111, "num_deletes": 505, "total_data_size": 1658454, "memory_usage": 1683288, "flush_reason": "Manual Compaction"}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451789495559, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 725128, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33490, "largest_seqno": 34595, "table_properties": {"data_size": 721111, "index_size": 1221, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13152, "raw_average_key_size": 19, "raw_value_size": 710643, "raw_average_value_size": 1031, "num_data_blocks": 53, "num_entries": 689, "num_filter_entries": 689, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451729, "oldest_key_time": 1769451729, "file_creation_time": 1769451789, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 11706 microseconds, and 4272 cpu microseconds.
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.495628) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 725128 bytes OK
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.495659) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.498187) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.498206) EVENT_LOG_v1 {"time_micros": 1769451789498199, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.498234) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 1652080, prev total WAL file size 1652080, number of live WAL files 2.
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.499206) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303033' seq:72057594037927935, type:22 .. '6D6772737461740031323534' seq:0, type:0; will stop at (end)
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(708KB)], [63(10MB)]
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451789499247, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 11947135, "oldest_snapshot_seqno": -1}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 5549 keys, 8352890 bytes, temperature: kUnknown
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451789549945, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 8352890, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8316919, "index_size": 21013, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13893, "raw_key_size": 143156, "raw_average_key_size": 25, "raw_value_size": 8217823, "raw_average_value_size": 1480, "num_data_blocks": 844, "num_entries": 5549, "num_filter_entries": 5549, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451789, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.550165) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 8352890 bytes
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.551756) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.4 rd, 164.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 10.7 +0.0 blob) out(8.0 +0.0 blob), read-write-amplify(28.0) write-amplify(11.5) OK, records in: 6550, records dropped: 1001 output_compression: NoCompression
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.551773) EVENT_LOG_v1 {"time_micros": 1769451789551765, "job": 38, "event": "compaction_finished", "compaction_time_micros": 50761, "compaction_time_cpu_micros": 23403, "output_level": 6, "num_output_files": 1, "total_output_size": 8352890, "num_input_records": 6550, "num_output_records": 5549, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451789551986, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451789554074, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.499073) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.554157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.554161) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.554163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.554164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:23:09 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:23:09.554166) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:23:10 np0005596062 nova_compute[227313]: 2026-01-26 18:23:10.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:23:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:10.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:23:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:10.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:11 np0005596062 nova_compute[227313]: 2026-01-26 18:23:11.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:23:11 np0005596062 nova_compute[227313]: 2026-01-26 18:23:11.095 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:12.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:12 np0005596062 nova_compute[227313]: 2026-01-26 18:23:12.371 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:12.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:14.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:14.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:16 np0005596062 nova_compute[227313]: 2026-01-26 18:23:16.097 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:16.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:16.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:17 np0005596062 nova_compute[227313]: 2026-01-26 18:23:17.375 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:18.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:18.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:20.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:20.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:20 np0005596062 podman[247437]: 2026-01-26 18:23:20.843566629 +0000 UTC m=+0.053623346 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 13:23:21 np0005596062 nova_compute[227313]: 2026-01-26 18:23:21.100 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:22.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:22 np0005596062 nova_compute[227313]: 2026-01-26 18:23:22.377 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:22.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:24.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:24.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:26 np0005596062 nova_compute[227313]: 2026-01-26 18:23:26.102 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:26.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:26.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:27 np0005596062 nova_compute[227313]: 2026-01-26 18:23:27.380 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:28.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:28.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:30.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:30.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:31 np0005596062 nova_compute[227313]: 2026-01-26 18:23:31.154 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:23:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:31 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:23:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:32.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:32 np0005596062 nova_compute[227313]: 2026-01-26 18:23:32.384 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:32.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:32 np0005596062 podman[247593]: 2026-01-26 18:23:32.88238765 +0000 UTC m=+0.086553971 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 13:23:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:34.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:34.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:36 np0005596062 nova_compute[227313]: 2026-01-26 18:23:36.157 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:36.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:36.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:37 np0005596062 nova_compute[227313]: 2026-01-26 18:23:37.387 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:38.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:38.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:39 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:39 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:23:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:23:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2601084213' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:23:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:23:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2601084213' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:23:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:40.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:40.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:41 np0005596062 nova_compute[227313]: 2026-01-26 18:23:41.159 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:42.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:42 np0005596062 nova_compute[227313]: 2026-01-26 18:23:42.389 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:42.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 26 13:23:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:44.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:44.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:46 np0005596062 nova_compute[227313]: 2026-01-26 18:23:46.163 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:46.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:46.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:47 np0005596062 nova_compute[227313]: 2026-01-26 18:23:47.393 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:47 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:23:47.414 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:23:47 np0005596062 nova_compute[227313]: 2026-01-26 18:23:47.414 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:47 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:23:47.415 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:23:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:48.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:48.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:50.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:50.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:51 np0005596062 nova_compute[227313]: 2026-01-26 18:23:51.216 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:51 np0005596062 podman[247728]: 2026-01-26 18:23:51.847726255 +0000 UTC m=+0.058266470 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:23:52 np0005596062 nova_compute[227313]: 2026-01-26 18:23:52.396 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:52.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:52 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:23:52.417 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:23:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:52.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:54.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:54.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:56 np0005596062 nova_compute[227313]: 2026-01-26 18:23:56.219 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:56.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:23:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:56.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:23:57 np0005596062 nova_compute[227313]: 2026-01-26 18:23:57.398 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:23:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:23:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:23:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:23:58.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:23:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:23:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:23:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:23:58.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:00.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:00.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:01 np0005596062 nova_compute[227313]: 2026-01-26 18:24:01.223 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:02 np0005596062 nova_compute[227313]: 2026-01-26 18:24:02.402 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:02.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:02.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:03 np0005596062 podman[247803]: 2026-01-26 18:24:03.933942866 +0000 UTC m=+0.140578686 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 13:24:04 np0005596062 nova_compute[227313]: 2026-01-26 18:24:04.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:04.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:04.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:06 np0005596062 nova_compute[227313]: 2026-01-26 18:24:06.255 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:06.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:06.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:07 np0005596062 nova_compute[227313]: 2026-01-26 18:24:07.404 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:08.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:08.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:24:09.173 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:24:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:24:09.173 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:24:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:24:09.173 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:24:09 np0005596062 nova_compute[227313]: 2026-01-26 18:24:09.929 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:24:09 np0005596062 nova_compute[227313]: 2026-01-26 18:24:09.930 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:24:09 np0005596062 nova_compute[227313]: 2026-01-26 18:24:09.930 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:24:09 np0005596062 nova_compute[227313]: 2026-01-26 18:24:09.930 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:24:09 np0005596062 nova_compute[227313]: 2026-01-26 18:24:09.930 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:24:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:24:10 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/712913086' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:24:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:10.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:10 np0005596062 nova_compute[227313]: 2026-01-26 18:24:10.439 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:24:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:10.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:10 np0005596062 nova_compute[227313]: 2026-01-26 18:24:10.588 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:24:10 np0005596062 nova_compute[227313]: 2026-01-26 18:24:10.589 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4847MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:24:10 np0005596062 nova_compute[227313]: 2026-01-26 18:24:10.589 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:24:10 np0005596062 nova_compute[227313]: 2026-01-26 18:24:10.590 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:24:11 np0005596062 nova_compute[227313]: 2026-01-26 18:24:11.257 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:11 np0005596062 nova_compute[227313]: 2026-01-26 18:24:11.355 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:24:11 np0005596062 nova_compute[227313]: 2026-01-26 18:24:11.356 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:24:11 np0005596062 nova_compute[227313]: 2026-01-26 18:24:11.386 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:24:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:24:11 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2423529415' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:24:11 np0005596062 nova_compute[227313]: 2026-01-26 18:24:11.865 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:24:11 np0005596062 nova_compute[227313]: 2026-01-26 18:24:11.872 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:24:12 np0005596062 nova_compute[227313]: 2026-01-26 18:24:12.407 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:12.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:12.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:14.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:14 np0005596062 nova_compute[227313]: 2026-01-26 18:24:14.522 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:24:14 np0005596062 nova_compute[227313]: 2026-01-26 18:24:14.524 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:24:14 np0005596062 nova_compute[227313]: 2026-01-26 18:24:14.524 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.934s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:24:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:14.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.259 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:16.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.525 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.526 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.553 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.553 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.553 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.569 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.569 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.570 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.570 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.570 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.570 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.570 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:24:16 np0005596062 nova_compute[227313]: 2026-01-26 18:24:16.570 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:24:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:16.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:17 np0005596062 nova_compute[227313]: 2026-01-26 18:24:17.411 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:18.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:18.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:20.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:20.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:21 np0005596062 nova_compute[227313]: 2026-01-26 18:24:21.278 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:22 np0005596062 nova_compute[227313]: 2026-01-26 18:24:22.414 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:22.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:22.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:22 np0005596062 podman[247933]: 2026-01-26 18:24:22.850339952 +0000 UTC m=+0.059468831 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 13:24:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:24.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:24.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:26 np0005596062 nova_compute[227313]: 2026-01-26 18:24:26.280 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:26.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:26.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:27 np0005596062 nova_compute[227313]: 2026-01-26 18:24:27.417 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:24:28.019 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:24:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:24:28.021 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:24:28 np0005596062 nova_compute[227313]: 2026-01-26 18:24:28.020 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:28.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:28.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:30.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:30.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:24:31.023 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:24:31 np0005596062 nova_compute[227313]: 2026-01-26 18:24:31.325 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:32 np0005596062 nova_compute[227313]: 2026-01-26 18:24:32.445 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:32.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:32.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:34.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:24:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:34.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:24:34 np0005596062 podman[247960]: 2026-01-26 18:24:34.893784288 +0000 UTC m=+0.099127185 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:24:36 np0005596062 nova_compute[227313]: 2026-01-26 18:24:36.327 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:36.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:36.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:37 np0005596062 nova_compute[227313]: 2026-01-26 18:24:37.447 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:38.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:38.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 13:24:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:24:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 13:24:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:24:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:40.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:40.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:41 np0005596062 nova_compute[227313]: 2026-01-26 18:24:41.329 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:24:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:24:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:24:42 np0005596062 nova_compute[227313]: 2026-01-26 18:24:42.449 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:42.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:42.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:44.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:44.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:46 np0005596062 nova_compute[227313]: 2026-01-26 18:24:46.379 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:46.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:46.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:47 np0005596062 nova_compute[227313]: 2026-01-26 18:24:47.495 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:48.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:48.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:24:48 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/870584699' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:24:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:24:48 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/870584699' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:24:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:24:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:24:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:50.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:50.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:51 np0005596062 nova_compute[227313]: 2026-01-26 18:24:51.382 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:52.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:52 np0005596062 nova_compute[227313]: 2026-01-26 18:24:52.498 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:52.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:53 np0005596062 podman[248348]: 2026-01-26 18:24:53.844014139 +0000 UTC m=+0.055937738 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:24:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:24:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:54.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:24:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:54.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:24:56 np0005596062 nova_compute[227313]: 2026-01-26 18:24:56.384 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:56.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:56.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:57 np0005596062 nova_compute[227313]: 2026-01-26 18:24:57.499 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:24:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:24:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:24:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:24:58.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:24:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:24:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:24:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:24:58.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:00.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:00.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:01 np0005596062 nova_compute[227313]: 2026-01-26 18:25:01.387 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:02.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:02 np0005596062 nova_compute[227313]: 2026-01-26 18:25:02.501 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:02.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.122 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.123 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.123 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.123 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.124 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:25:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:04.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:04 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:25:04 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/966078527' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.622 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:25:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:04.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.793 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.795 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4842MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.795 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:25:04 np0005596062 nova_compute[227313]: 2026-01-26 18:25:04.795 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.210 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.211 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.300 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:25:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:25:05 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3036036379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.792 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.799 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:25:05 np0005596062 podman[248466]: 2026-01-26 18:25:05.889842028 +0000 UTC m=+0.094644486 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.920 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.922 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:25:05 np0005596062 nova_compute[227313]: 2026-01-26 18:25:05.922 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:25:06 np0005596062 nova_compute[227313]: 2026-01-26 18:25:06.390 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:06.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:06.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:07 np0005596062 nova_compute[227313]: 2026-01-26 18:25:07.504 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:08.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:08.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:08 np0005596062 nova_compute[227313]: 2026-01-26 18:25:08.924 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:09 np0005596062 nova_compute[227313]: 2026-01-26 18:25:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:09 np0005596062 nova_compute[227313]: 2026-01-26 18:25:09.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:25:09 np0005596062 nova_compute[227313]: 2026-01-26 18:25:09.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:25:09 np0005596062 nova_compute[227313]: 2026-01-26 18:25:09.077 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:25:09 np0005596062 nova_compute[227313]: 2026-01-26 18:25:09.078 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:09.174 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:25:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:09.175 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:25:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:09.175 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:25:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:10.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:10.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:11 np0005596062 nova_compute[227313]: 2026-01-26 18:25:11.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:11 np0005596062 nova_compute[227313]: 2026-01-26 18:25:11.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:11 np0005596062 nova_compute[227313]: 2026-01-26 18:25:11.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:25:11 np0005596062 nova_compute[227313]: 2026-01-26 18:25:11.392 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:12 np0005596062 nova_compute[227313]: 2026-01-26 18:25:12.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 26 13:25:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:12.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:12 np0005596062 nova_compute[227313]: 2026-01-26 18:25:12.505 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:12.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:13 np0005596062 nova_compute[227313]: 2026-01-26 18:25:13.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:13 np0005596062 nova_compute[227313]: 2026-01-26 18:25:13.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:25:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:13.084 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:25:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:13.085 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:25:13 np0005596062 nova_compute[227313]: 2026-01-26 18:25:13.151 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:14.089 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:25:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:14.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:14.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:16 np0005596062 nova_compute[227313]: 2026-01-26 18:25:16.395 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:16.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:16.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:17 np0005596062 nova_compute[227313]: 2026-01-26 18:25:17.510 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:18.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:18.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 26 13:25:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:20.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:20.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:21 np0005596062 nova_compute[227313]: 2026-01-26 18:25:21.397 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:22.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:22 np0005596062 nova_compute[227313]: 2026-01-26 18:25:22.527 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:22.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:24.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:24.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:24 np0005596062 podman[248553]: 2026-01-26 18:25:24.884480192 +0000 UTC m=+0.085357459 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 13:25:26 np0005596062 nova_compute[227313]: 2026-01-26 18:25:26.399 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:26.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:26.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:27 np0005596062 nova_compute[227313]: 2026-01-26 18:25:27.530 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:28.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:28.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:30.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:30.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:31 np0005596062 nova_compute[227313]: 2026-01-26 18:25:31.401 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:32.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:32 np0005596062 nova_compute[227313]: 2026-01-26 18:25:32.533 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:32.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:34.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:34.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:36 np0005596062 nova_compute[227313]: 2026-01-26 18:25:36.403 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:36.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.525543) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451936525668, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1775, "num_deletes": 252, "total_data_size": 4117328, "memory_usage": 4170136, "flush_reason": "Manual Compaction"}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451936549374, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2693770, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34600, "largest_seqno": 36370, "table_properties": {"data_size": 2686325, "index_size": 4388, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15918, "raw_average_key_size": 20, "raw_value_size": 2671345, "raw_average_value_size": 3433, "num_data_blocks": 191, "num_entries": 778, "num_filter_entries": 778, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451790, "oldest_key_time": 1769451790, "file_creation_time": 1769451936, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 24005 microseconds, and 7988 cpu microseconds.
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.549565) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2693770 bytes OK
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.549635) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.551971) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.552015) EVENT_LOG_v1 {"time_micros": 1769451936552002, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.552040) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 4109307, prev total WAL file size 4109944, number of live WAL files 2.
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.553852) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2630KB)], [66(8157KB)]
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451936553901, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 11046660, "oldest_snapshot_seqno": -1}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 5802 keys, 9074951 bytes, temperature: kUnknown
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451936658747, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 9074951, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9036758, "index_size": 22595, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14533, "raw_key_size": 149194, "raw_average_key_size": 25, "raw_value_size": 8932697, "raw_average_value_size": 1539, "num_data_blocks": 907, "num_entries": 5802, "num_filter_entries": 5802, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769451936, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.659360) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 9074951 bytes
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.666995) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.3 rd, 86.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 8.0 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(7.5) write-amplify(3.4) OK, records in: 6327, records dropped: 525 output_compression: NoCompression
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.667071) EVENT_LOG_v1 {"time_micros": 1769451936667052, "job": 40, "event": "compaction_finished", "compaction_time_micros": 104953, "compaction_time_cpu_micros": 25637, "output_level": 6, "num_output_files": 1, "total_output_size": 9074951, "num_input_records": 6327, "num_output_records": 5802, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451936667981, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769451936670025, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.553745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.670117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.670125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.670129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.670132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:25:36 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:25:36.670134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:25:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:36.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:36 np0005596062 podman[248629]: 2026-01-26 18:25:36.889803463 +0000 UTC m=+0.095654793 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:25:37 np0005596062 nova_compute[227313]: 2026-01-26 18:25:37.535 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:38.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:38.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:40.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:40.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:41 np0005596062 nova_compute[227313]: 2026-01-26 18:25:41.405 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:42.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:42 np0005596062 nova_compute[227313]: 2026-01-26 18:25:42.591 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:42.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:44.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:44.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:46 np0005596062 nova_compute[227313]: 2026-01-26 18:25:46.409 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:46.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:46.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:47 np0005596062 nova_compute[227313]: 2026-01-26 18:25:47.593 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:48.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:48.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 13:25:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:25:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:25:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:25:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:50.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:50.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:51 np0005596062 nova_compute[227313]: 2026-01-26 18:25:51.412 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:52.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:52 np0005596062 nova_compute[227313]: 2026-01-26 18:25:52.596 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:25:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:52.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:25:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:53.463 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:25:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:25:53.464 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:25:53 np0005596062 nova_compute[227313]: 2026-01-26 18:25:53.464 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:54.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:54.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:55 np0005596062 podman[248820]: 2026-01-26 18:25:55.835156548 +0000 UTC m=+0.051823418 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 13:25:56 np0005596062 nova_compute[227313]: 2026-01-26 18:25:56.452 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:56.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:25:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:56.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:25:57 np0005596062 nova_compute[227313]: 2026-01-26 18:25:57.598 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:25:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:25:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:25:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:25:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:25:58.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:25:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:25:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:25:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:25:58.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:00.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:00.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:01 np0005596062 nova_compute[227313]: 2026-01-26 18:26:01.455 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:01 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:01.465 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:26:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:02.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:02 np0005596062 nova_compute[227313]: 2026-01-26 18:26:02.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:02.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:04.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:04.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.088 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.088 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.088 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.089 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.089 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.457 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:26:06 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/291937883' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.599 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:26:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:06.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.799 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.800 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4865MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.801 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.802 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.878 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.879 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:26:06 np0005596062 nova_compute[227313]: 2026-01-26 18:26:06.900 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:26:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:26:07 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/191631852' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:26:07 np0005596062 nova_compute[227313]: 2026-01-26 18:26:07.343 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:26:07 np0005596062 nova_compute[227313]: 2026-01-26 18:26:07.356 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:26:07 np0005596062 nova_compute[227313]: 2026-01-26 18:26:07.382 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:26:07 np0005596062 nova_compute[227313]: 2026-01-26 18:26:07.383 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:26:07 np0005596062 nova_compute[227313]: 2026-01-26 18:26:07.384 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:26:07 np0005596062 nova_compute[227313]: 2026-01-26 18:26:07.602 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:07 np0005596062 podman[248962]: 2026-01-26 18:26:07.887812637 +0000 UTC m=+0.100595144 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 26 13:26:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:08.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:08.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:09.176 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:26:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:09.178 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:26:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:09.178 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:26:09 np0005596062 nova_compute[227313]: 2026-01-26 18:26:09.385 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 26 13:26:10 np0005596062 nova_compute[227313]: 2026-01-26 18:26:10.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:10.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:10.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:11 np0005596062 nova_compute[227313]: 2026-01-26 18:26:11.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:11 np0005596062 nova_compute[227313]: 2026-01-26 18:26:11.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:26:11 np0005596062 nova_compute[227313]: 2026-01-26 18:26:11.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:26:11 np0005596062 nova_compute[227313]: 2026-01-26 18:26:11.104 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:26:11 np0005596062 nova_compute[227313]: 2026-01-26 18:26:11.460 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:12 np0005596062 nova_compute[227313]: 2026-01-26 18:26:12.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:12 np0005596062 nova_compute[227313]: 2026-01-26 18:26:12.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:12 np0005596062 nova_compute[227313]: 2026-01-26 18:26:12.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:26:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:12.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:12 np0005596062 nova_compute[227313]: 2026-01-26 18:26:12.605 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:12.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:13 np0005596062 nova_compute[227313]: 2026-01-26 18:26:13.045 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:14 np0005596062 nova_compute[227313]: 2026-01-26 18:26:14.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:14.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:14.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:15 np0005596062 nova_compute[227313]: 2026-01-26 18:26:15.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:16 np0005596062 nova_compute[227313]: 2026-01-26 18:26:16.461 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:26:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2772374308' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:26:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:26:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2772374308' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:26:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:16.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:16.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:17 np0005596062 nova_compute[227313]: 2026-01-26 18:26:17.055 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:26:17 np0005596062 nova_compute[227313]: 2026-01-26 18:26:17.617 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 26 13:26:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:18.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:18.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:20.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:20.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:21 np0005596062 nova_compute[227313]: 2026-01-26 18:26:21.465 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 26 13:26:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:22.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:22 np0005596062 nova_compute[227313]: 2026-01-26 18:26:22.619 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:22.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:24.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 26 13:26:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:24.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:26 np0005596062 nova_compute[227313]: 2026-01-26 18:26:26.467 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:26.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:26.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:26 np0005596062 podman[249049]: 2026-01-26 18:26:26.847603266 +0000 UTC m=+0.056243715 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 13:26:27 np0005596062 nova_compute[227313]: 2026-01-26 18:26:27.620 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:28.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:28.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:30.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:30.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:31.380 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:26:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:31.381 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:26:31 np0005596062 nova_compute[227313]: 2026-01-26 18:26:31.380 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:31 np0005596062 nova_compute[227313]: 2026-01-26 18:26:31.503 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 26 13:26:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:32.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:32 np0005596062 nova_compute[227313]: 2026-01-26 18:26:32.671 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:32.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:34.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:34.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:36 np0005596062 nova_compute[227313]: 2026-01-26 18:26:36.505 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:36.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:36.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:37 np0005596062 nova_compute[227313]: 2026-01-26 18:26:37.672 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:38.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:38.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:38 np0005596062 podman[249123]: 2026-01-26 18:26:38.939575259 +0000 UTC m=+0.143438522 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 13:26:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:26:39.383 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:26:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 26 13:26:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:40.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:40.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:41 np0005596062 nova_compute[227313]: 2026-01-26 18:26:41.509 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:42.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:42 np0005596062 nova_compute[227313]: 2026-01-26 18:26:42.673 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:42.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:44.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:44.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:46 np0005596062 nova_compute[227313]: 2026-01-26 18:26:46.512 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:46.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:46.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:47 np0005596062 nova_compute[227313]: 2026-01-26 18:26:47.675 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:48.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:48.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:50.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:50.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:51 np0005596062 nova_compute[227313]: 2026-01-26 18:26:51.553 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:52.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:52 np0005596062 nova_compute[227313]: 2026-01-26 18:26:52.681 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:52.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:26:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:54.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:26:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:54.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:56 np0005596062 nova_compute[227313]: 2026-01-26 18:26:56.555 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:56.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:56.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:26:57 np0005596062 podman[249232]: 2026-01-26 18:26:57.546656306 +0000 UTC m=+0.050550744 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 26 13:26:57 np0005596062 nova_compute[227313]: 2026-01-26 18:26:57.682 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:26:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:26:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:26:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:26:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:26:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:26:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:26:58.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:26:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:26:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:26:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:26:58.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:00.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:00.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:01 np0005596062 nova_compute[227313]: 2026-01-26 18:27:01.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:01 np0005596062 nova_compute[227313]: 2026-01-26 18:27:01.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:27:01 np0005596062 nova_compute[227313]: 2026-01-26 18:27:01.557 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:02 np0005596062 nova_compute[227313]: 2026-01-26 18:27:02.684 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:02.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:04.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:05 np0005596062 nova_compute[227313]: 2026-01-26 18:27:05.068 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:05 np0005596062 nova_compute[227313]: 2026-01-26 18:27:05.068 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:27:05 np0005596062 nova_compute[227313]: 2026-01-26 18:27:05.085 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:27:05 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:27:06 np0005596062 nova_compute[227313]: 2026-01-26 18:27:06.567 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:06.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:06.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.067 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.164 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.165 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.166 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.166 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.167 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:27:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:27:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:27:07 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4194753140' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.717 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.735 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.925 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.926 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4841MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.927 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:27:07 np0005596062 nova_compute[227313]: 2026-01-26 18:27:07.927 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.222 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.223 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.300 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.319 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.320 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.398 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.422 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:27:08 np0005596062 nova_compute[227313]: 2026-01-26 18:27:08.448 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:27:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:27:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:08.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:27:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:27:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:08.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:27:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:09.178 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:27:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:09.179 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:27:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:09.179 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:27:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:27:09 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3738025599' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:27:09 np0005596062 nova_compute[227313]: 2026-01-26 18:27:09.822 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:27:09 np0005596062 nova_compute[227313]: 2026-01-26 18:27:09.830 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:27:09 np0005596062 nova_compute[227313]: 2026-01-26 18:27:09.856 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:27:09 np0005596062 nova_compute[227313]: 2026-01-26 18:27:09.857 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:27:09 np0005596062 nova_compute[227313]: 2026-01-26 18:27:09.857 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:27:09 np0005596062 podman[249457]: 2026-01-26 18:27:09.891781655 +0000 UTC m=+0.103407619 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 26 13:27:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:10.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:10.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:11 np0005596062 nova_compute[227313]: 2026-01-26 18:27:11.572 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:11 np0005596062 nova_compute[227313]: 2026-01-26 18:27:11.840 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:11 np0005596062 nova_compute[227313]: 2026-01-26 18:27:11.841 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:12.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:12 np0005596062 nova_compute[227313]: 2026-01-26 18:27:12.721 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:12.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:13 np0005596062 nova_compute[227313]: 2026-01-26 18:27:13.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:13 np0005596062 nova_compute[227313]: 2026-01-26 18:27:13.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:27:13 np0005596062 nova_compute[227313]: 2026-01-26 18:27:13.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:27:13 np0005596062 nova_compute[227313]: 2026-01-26 18:27:13.079 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:27:13 np0005596062 nova_compute[227313]: 2026-01-26 18:27:13.079 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:13 np0005596062 nova_compute[227313]: 2026-01-26 18:27:13.079 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:27:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:14 np0005596062 nova_compute[227313]: 2026-01-26 18:27:14.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:14.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:27:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:14.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:27:15 np0005596062 nova_compute[227313]: 2026-01-26 18:27:15.047 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:15 np0005596062 nova_compute[227313]: 2026-01-26 18:27:15.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:16 np0005596062 ceph-mgr[77538]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2716354406
Jan 26 13:27:16 np0005596062 nova_compute[227313]: 2026-01-26 18:27:16.613 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:16.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:16.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:17 np0005596062 nova_compute[227313]: 2026-01-26 18:27:17.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:17 np0005596062 nova_compute[227313]: 2026-01-26 18:27:17.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:27:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:17.528 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:27:17 np0005596062 nova_compute[227313]: 2026-01-26 18:27:17.528 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:17 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:17.529 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:27:17 np0005596062 nova_compute[227313]: 2026-01-26 18:27:17.769 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:18.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:18.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:19.530 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:27:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:20.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:20.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:21 np0005596062 nova_compute[227313]: 2026-01-26 18:27:21.615 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:22.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:22 np0005596062 nova_compute[227313]: 2026-01-26 18:27:22.770 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:22.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:24.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:27:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:24.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:27:26 np0005596062 nova_compute[227313]: 2026-01-26 18:27:26.617 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:26.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:26.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:27 np0005596062 nova_compute[227313]: 2026-01-26 18:27:27.808 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:27 np0005596062 podman[249547]: 2026-01-26 18:27:27.895951268 +0000 UTC m=+0.066566560 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 13:27:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:28.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:28.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:30.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:30.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:31 np0005596062 nova_compute[227313]: 2026-01-26 18:27:31.619 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:32.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:32 np0005596062 nova_compute[227313]: 2026-01-26 18:27:32.812 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:32.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:34.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:34.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:36 np0005596062 nova_compute[227313]: 2026-01-26 18:27:36.621 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:36.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:36.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:37 np0005596062 nova_compute[227313]: 2026-01-26 18:27:37.815 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:38.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:38.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:27:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1257827261' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:27:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:27:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1257827261' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:27:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:40.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:40 np0005596062 podman[249622]: 2026-01-26 18:27:40.89288439 +0000 UTC m=+0.093773332 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:27:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:40.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:41 np0005596062 nova_compute[227313]: 2026-01-26 18:27:41.624 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:42.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:42 np0005596062 nova_compute[227313]: 2026-01-26 18:27:42.816 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:42.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:44.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:44.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:46 np0005596062 nova_compute[227313]: 2026-01-26 18:27:46.627 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:46.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:46.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:47 np0005596062 nova_compute[227313]: 2026-01-26 18:27:47.857 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:48.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:48.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:50.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:51 np0005596062 nova_compute[227313]: 2026-01-26 18:27:51.629 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:52.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:52 np0005596062 nova_compute[227313]: 2026-01-26 18:27:52.859 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:52.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:54.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:27:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:54.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:27:56 np0005596062 nova_compute[227313]: 2026-01-26 18:27:56.631 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:27:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:56.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:27:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:27:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:56.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:27:57 np0005596062 nova_compute[227313]: 2026-01-26 18:27:57.446 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:57 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:57.445 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:27:57 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:27:57.448 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:27:57 np0005596062 nova_compute[227313]: 2026-01-26 18:27:57.861 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:27:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:27:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:27:58.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:27:58 np0005596062 podman[249709]: 2026-01-26 18:27:58.864835928 +0000 UTC m=+0.078372163 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 13:27:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:27:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:27:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:27:58.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.003000080s ======
Jan 26 13:28:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:00.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 26 13:28:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:00.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:01 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:01.450 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:28:01 np0005596062 nova_compute[227313]: 2026-01-26 18:28:01.679 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:02 np0005596062 ceph-osd[79865]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 26 13:28:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:02.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:02 np0005596062 nova_compute[227313]: 2026-01-26 18:28:02.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:02.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:04.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:28:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:04.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:28:06 np0005596062 nova_compute[227313]: 2026-01-26 18:28:06.681 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:06.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:06.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:07 np0005596062 nova_compute[227313]: 2026-01-26 18:28:07.924 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:28:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:28:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:28:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:08.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:08.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.154 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.175 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.176 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.176 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.176 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.177 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:28:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:09.179 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:28:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:09.179 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:28:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:09.179 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:28:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:28:09 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3402287956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.624 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.818 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.820 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4827MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.821 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.821 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.902 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.903 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:28:09 np0005596062 nova_compute[227313]: 2026-01-26 18:28:09.938 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:28:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:28:10 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4067318210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:28:10 np0005596062 nova_compute[227313]: 2026-01-26 18:28:10.435 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:28:10 np0005596062 nova_compute[227313]: 2026-01-26 18:28:10.443 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:28:10 np0005596062 nova_compute[227313]: 2026-01-26 18:28:10.465 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:28:10 np0005596062 nova_compute[227313]: 2026-01-26 18:28:10.468 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:28:10 np0005596062 nova_compute[227313]: 2026-01-26 18:28:10.468 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:28:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:10.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:10.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:11 np0005596062 nova_compute[227313]: 2026-01-26 18:28:11.684 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:11 np0005596062 podman[249909]: 2026-01-26 18:28:11.923506839 +0000 UTC m=+0.128673810 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 26 13:28:12 np0005596062 nova_compute[227313]: 2026-01-26 18:28:12.366 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:12.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:12 np0005596062 nova_compute[227313]: 2026-01-26 18:28:12.957 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:12.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:13 np0005596062 nova_compute[227313]: 2026-01-26 18:28:13.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:13 np0005596062 nova_compute[227313]: 2026-01-26 18:28:13.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:13 np0005596062 nova_compute[227313]: 2026-01-26 18:28:13.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:28:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:14 np0005596062 nova_compute[227313]: 2026-01-26 18:28:14.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:14 np0005596062 nova_compute[227313]: 2026-01-26 18:28:14.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:28:14 np0005596062 nova_compute[227313]: 2026-01-26 18:28:14.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:28:14 np0005596062 nova_compute[227313]: 2026-01-26 18:28:14.069 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:28:14 np0005596062 nova_compute[227313]: 2026-01-26 18:28:14.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:28:14 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:28:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:14.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:14.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:16 np0005596062 nova_compute[227313]: 2026-01-26 18:28:16.066 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 26 13:28:16 np0005596062 nova_compute[227313]: 2026-01-26 18:28:16.686 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:16.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:16.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:17 np0005596062 nova_compute[227313]: 2026-01-26 18:28:17.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 26 13:28:17 np0005596062 nova_compute[227313]: 2026-01-26 18:28:17.959 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:18 np0005596062 nova_compute[227313]: 2026-01-26 18:28:18.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:18 np0005596062 nova_compute[227313]: 2026-01-26 18:28:18.064 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:28:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 26 13:28:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:18.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:18.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 26 13:28:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:20.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:20.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:21 np0005596062 nova_compute[227313]: 2026-01-26 18:28:21.688 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:28:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:22.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:28:22 np0005596062 nova_compute[227313]: 2026-01-26 18:28:22.963 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:22.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 26 13:28:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:24.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:24.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 26 13:28:26 np0005596062 nova_compute[227313]: 2026-01-26 18:28:26.691 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:26.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:26.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:27 np0005596062 nova_compute[227313]: 2026-01-26 18:28:27.965 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:28.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:28.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:29 np0005596062 podman[250044]: 2026-01-26 18:28:29.850750041 +0000 UTC m=+0.057005096 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:28:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:30.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:31 np0005596062 nova_compute[227313]: 2026-01-26 18:28:31.693 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 26 13:28:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:32.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:32 np0005596062 nova_compute[227313]: 2026-01-26 18:28:32.967 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:32.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 26 13:28:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:34.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4066303913' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4066303913' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3966895016' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:28:36 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3966895016' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:28:36 np0005596062 nova_compute[227313]: 2026-01-26 18:28:36.696 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:36.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:36.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:37 np0005596062 nova_compute[227313]: 2026-01-26 18:28:37.970 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:38.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:28:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:38.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:28:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:39.479 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:28:39 np0005596062 nova_compute[227313]: 2026-01-26 18:28:39.480 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:39.480 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:28:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 26 13:28:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:40.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:40.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:41 np0005596062 nova_compute[227313]: 2026-01-26 18:28:41.698 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:42.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:42 np0005596062 podman[250120]: 2026-01-26 18:28:42.905146005 +0000 UTC m=+0.105521364 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 13:28:42 np0005596062 nova_compute[227313]: 2026-01-26 18:28:42.973 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:42.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:44.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:44.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:46 np0005596062 nova_compute[227313]: 2026-01-26 18:28:46.701 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:28:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:46.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:28:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:28:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:47.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:28:47 np0005596062 nova_compute[227313]: 2026-01-26 18:28:47.974 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:48.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:49.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:49 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:28:49.482 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:28:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:50.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:51 np0005596062 nova_compute[227313]: 2026-01-26 18:28:51.704 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:52.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:52 np0005596062 nova_compute[227313]: 2026-01-26 18:28:52.975 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:53.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:54.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:55.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:56 np0005596062 nova_compute[227313]: 2026-01-26 18:28:56.706 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:56.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:28:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:57.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:28:57 np0005596062 nova_compute[227313]: 2026-01-26 18:28:57.978 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:28:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:28:58.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:28:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:28:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:28:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:28:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:28:59.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:00.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:00 np0005596062 podman[250205]: 2026-01-26 18:29:00.840587155 +0000 UTC m=+0.052773103 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:29:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:01.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:01 np0005596062 nova_compute[227313]: 2026-01-26 18:29:01.709 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:02.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:02 np0005596062 nova_compute[227313]: 2026-01-26 18:29:02.980 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:03.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:04 np0005596062 nova_compute[227313]: 2026-01-26 18:29:04.606 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:04 np0005596062 nova_compute[227313]: 2026-01-26 18:29:04.607 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:04 np0005596062 nova_compute[227313]: 2026-01-26 18:29:04.632 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:29:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:04.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:05.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.058 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.059 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.074 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.074 227317 INFO nova.compute.claims [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.243 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:29:05 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1433977840' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.720 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.732 227317 DEBUG nova.compute.provider_tree [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.775 227317 DEBUG nova.scheduler.client.report [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.818 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.820 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.873 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.874 227317 DEBUG nova.network.neutron [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.906 227317 INFO nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:29:05 np0005596062 nova_compute[227313]: 2026-01-26 18:29:05.952 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.092 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.093 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.094 227317 INFO nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Creating image(s)#033[00m
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.548377) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452146548475, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2441, "num_deletes": 255, "total_data_size": 5770723, "memory_usage": 5861760, "flush_reason": "Manual Compaction"}
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.565 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.607 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.640 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452146642076, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3781789, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36375, "largest_seqno": 38811, "table_properties": {"data_size": 3771916, "index_size": 6303, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20436, "raw_average_key_size": 20, "raw_value_size": 3752070, "raw_average_value_size": 3789, "num_data_blocks": 275, "num_entries": 990, "num_filter_entries": 990, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769451936, "oldest_key_time": 1769451936, "file_creation_time": 1769452146, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 93726 microseconds, and 9973 cpu microseconds.
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.642116) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3781789 bytes OK
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.642138) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.645030) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.645048) EVENT_LOG_v1 {"time_micros": 1769452146645041, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.645071) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5760077, prev total WAL file size 5760077, number of live WAL files 2.
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.645 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.646519) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3693KB)], [69(8862KB)]
Jan 26 13:29:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452146646575, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 12856740, "oldest_snapshot_seqno": -1}
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.713 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.721 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.722 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.722 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:06 np0005596062 nova_compute[227313]: 2026-01-26 18:29:06.723 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:06.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6267 keys, 10869087 bytes, temperature: kUnknown
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452147003674, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 10869087, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10826183, "index_size": 26154, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 159778, "raw_average_key_size": 25, "raw_value_size": 10712306, "raw_average_value_size": 1709, "num_data_blocks": 1054, "num_entries": 6267, "num_filter_entries": 6267, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452146, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.004309) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 10869087 bytes
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.009225) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 36.0 rd, 30.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 8.7 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(6.3) write-amplify(2.9) OK, records in: 6792, records dropped: 525 output_compression: NoCompression
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.009258) EVENT_LOG_v1 {"time_micros": 1769452147009242, "job": 42, "event": "compaction_finished", "compaction_time_micros": 357386, "compaction_time_cpu_micros": 30095, "output_level": 6, "num_output_files": 1, "total_output_size": 10869087, "num_input_records": 6792, "num_output_records": 6267, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452147011075, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452147014579, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:06.646390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.014665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.014673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.014676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.014679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:07 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:07.014682) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:07.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:07 np0005596062 nova_compute[227313]: 2026-01-26 18:29:07.033 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:07 np0005596062 nova_compute[227313]: 2026-01-26 18:29:07.039 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 54346782-8bd9-4542-b5be-744da7428268_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:07 np0005596062 nova_compute[227313]: 2026-01-26 18:29:07.643 227317 DEBUG nova.network.neutron [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Successfully created port: 3aeabb47-ce6c-439f-9d90-fecedc18e77f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:29:07 np0005596062 nova_compute[227313]: 2026-01-26 18:29:07.983 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:08 np0005596062 nova_compute[227313]: 2026-01-26 18:29:08.131 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 54346782-8bd9-4542-b5be-744da7428268_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:08 np0005596062 nova_compute[227313]: 2026-01-26 18:29:08.220 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] resizing rbd image 54346782-8bd9-4542-b5be-744da7428268_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:29:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:08.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.008 227317 DEBUG nova.network.neutron [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Successfully updated port: 3aeabb47-ce6c-439f-9d90-fecedc18e77f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:29:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:09.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.053 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "refresh_cache-54346782-8bd9-4542-b5be-744da7428268" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.054 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquired lock "refresh_cache-54346782-8bd9-4542-b5be-744da7428268" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.054 227317 DEBUG nova.network.neutron [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.147 227317 DEBUG nova.compute.manager [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-changed-3aeabb47-ce6c-439f-9d90-fecedc18e77f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.148 227317 DEBUG nova.compute.manager [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Refreshing instance network info cache due to event network-changed-3aeabb47-ce6c-439f-9d90-fecedc18e77f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.149 227317 DEBUG oslo_concurrency.lockutils [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-54346782-8bd9-4542-b5be-744da7428268" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:29:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:09.179 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:09.180 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:09.180 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.234 227317 DEBUG nova.objects.instance [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lazy-loading 'migration_context' on Instance uuid 54346782-8bd9-4542-b5be-744da7428268 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.271 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.271 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Ensure instance console log exists: /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.272 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.272 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.273 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:09 np0005596062 nova_compute[227313]: 2026-01-26 18:29:09.366 227317 DEBUG nova.network.neutron [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:29:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:10.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.006 227317 DEBUG nova.network.neutron [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Updating instance_info_cache with network_info: [{"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.029 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Releasing lock "refresh_cache-54346782-8bd9-4542-b5be-744da7428268" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.030 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Instance network_info: |[{"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.031 227317 DEBUG oslo_concurrency.lockutils [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-54346782-8bd9-4542-b5be-744da7428268" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.031 227317 DEBUG nova.network.neutron [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Refreshing network info cache for port 3aeabb47-ce6c-439f-9d90-fecedc18e77f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:29:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:11.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.035 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Start _get_guest_xml network_info=[{"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.040 227317 WARNING nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.050 227317 DEBUG nova.virt.libvirt.host [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.051 227317 DEBUG nova.virt.libvirt.host [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.060 227317 DEBUG nova.virt.libvirt.host [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.061 227317 DEBUG nova.virt.libvirt.host [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.064 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.065 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.065 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.066 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.066 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.067 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.067 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.067 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.068 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.068 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.068 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.069 227317 DEBUG nova.virt.hardware [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.074 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.113 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.114 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.114 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.114 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.115 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:29:11 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1039184944' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:29:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:29:11 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3305391432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.581 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.609 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.613 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.631 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.715 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.805 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.806 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4736MB free_disk=20.982471466064453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.807 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.808 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.976 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 54346782-8bd9-4542-b5be-744da7428268 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.977 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:29:11 np0005596062 nova_compute[227313]: 2026-01-26 18:29:11.977 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:29:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:29:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4141782849' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.025 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.027 227317 DEBUG nova.virt.libvirt.vif [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-717426979',display_name='tempest-TestServerMultinode-server-717426979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-717426979',id=17,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f2b1e48060904db7a7d629fffdaa921a',ramdisk_id='',reservation_id='r-bibtuuhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-128980879',owner_user_name='tempest-TestServerMultinode-12898
0879-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:29:05Z,user_data=None,user_id='87b6f2cd2d124de2be281e270184d195',uuid=54346782-8bd9-4542-b5be-744da7428268,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.028 227317 DEBUG nova.network.os_vif_util [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Converting VIF {"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.029 227317 DEBUG nova.network.os_vif_util [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.031 227317 DEBUG nova.objects.instance [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lazy-loading 'pci_devices' on Instance uuid 54346782-8bd9-4542-b5be-744da7428268 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.034 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.077 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <uuid>54346782-8bd9-4542-b5be-744da7428268</uuid>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <name>instance-00000011</name>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestServerMultinode-server-717426979</nova:name>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:29:11</nova:creationTime>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:user uuid="87b6f2cd2d124de2be281e270184d195">tempest-TestServerMultinode-128980879-project-admin</nova:user>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:project uuid="f2b1e48060904db7a7d629fffdaa921a">tempest-TestServerMultinode-128980879</nova:project>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <nova:port uuid="3aeabb47-ce6c-439f-9d90-fecedc18e77f">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <entry name="serial">54346782-8bd9-4542-b5be-744da7428268</entry>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <entry name="uuid">54346782-8bd9-4542-b5be-744da7428268</entry>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/54346782-8bd9-4542-b5be-744da7428268_disk">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/54346782-8bd9-4542-b5be-744da7428268_disk.config">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:be:b6:8b"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <target dev="tap3aeabb47-ce"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/console.log" append="off"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:29:12 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:29:12 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:29:12 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:29:12 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.079 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Preparing to wait for external event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.079 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.080 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.080 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.081 227317 DEBUG nova.virt.libvirt.vif [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-717426979',display_name='tempest-TestServerMultinode-server-717426979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-717426979',id=17,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f2b1e48060904db7a7d629fffdaa921a',ramdisk_id='',reservation_id='r-bibtuuhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-128980879',owner_user_name='tempest-TestServerMulti
node-128980879-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:29:05Z,user_data=None,user_id='87b6f2cd2d124de2be281e270184d195',uuid=54346782-8bd9-4542-b5be-744da7428268,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.081 227317 DEBUG nova.network.os_vif_util [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Converting VIF {"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.082 227317 DEBUG nova.network.os_vif_util [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.082 227317 DEBUG os_vif [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.083 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.083 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.083 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.090 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.091 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3aeabb47-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.092 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3aeabb47-ce, col_values=(('external_ids', {'iface-id': '3aeabb47-ce6c-439f-9d90-fecedc18e77f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:b6:8b', 'vm-uuid': '54346782-8bd9-4542-b5be-744da7428268'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:12 np0005596062 NetworkManager[48993]: <info>  [1769452152.0956] manager: (tap3aeabb47-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.098 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.102 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.104 227317 INFO os_vif [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce')#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.255 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.255 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.256 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] No VIF found with MAC fa:16:3e:be:b6:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.256 227317 INFO nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Using config drive#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.284 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:29:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2361267280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.460 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.466 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.486 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.517 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.518 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.701 227317 INFO nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Creating config drive at /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/disk.config#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.707 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi2gcs6h4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:12.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.839 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi2gcs6h4" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.874 227317 DEBUG nova.storage.rbd_utils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] rbd image 54346782-8bd9-4542-b5be-744da7428268_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:29:12 np0005596062 nova_compute[227313]: 2026-01-26 18:29:12.878 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/disk.config 54346782-8bd9-4542-b5be-744da7428268_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:13.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.046 227317 DEBUG oslo_concurrency.processutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/disk.config 54346782-8bd9-4542-b5be-744da7428268_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.047 227317 INFO nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Deleting local config drive /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268/disk.config because it was imported into RBD.#033[00m
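Annotator's note: the config-drive sequence above is three steps — build an ISO 9660 image with `mkisofs`, import it into the Ceph `vms` pool with `rbd import`, then delete the local copy. A sketch reconstructing the two argv lists exactly as they appear in the log (the builder functions are illustrative, not Nova's actual code; paths and IDs are the ones logged):

```python
# Sketch: the two commands Nova ran above for the config drive.
# Helper names are ours; argument lists mirror the logged CMD lines.

def mkisofs_cmd(iso_path, src_dir, publisher, label='config-2'):
    # Joliet (-J) + Rock Ridge (-r) ISO with the 'config-2' volume
    # label that cloud-init searches for on a config drive.
    return ['/usr/bin/mkisofs', '-o', iso_path, '-ldots',
            '-allow-lowercase', '-allow-multidot', '-l',
            '-publisher', publisher, '-quiet', '-J', '-r',
            '-V', label, src_dir]

def rbd_import_cmd(pool, local_path, image_name, ceph_id, conf):
    # Format 2 RBD images support cloning/snapshots.
    return ['rbd', 'import', '--pool', pool, local_path, image_name,
            '--image-format=2', '--id', ceph_id, '--conf', conf]

iso = ('/var/lib/nova/instances/'
       '54346782-8bd9-4542-b5be-744da7428268/disk.config')
cmd1 = mkisofs_cmd(iso, '/tmp/tmpi2gcs6h4',
                   'OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9')
cmd2 = rbd_import_cmd('vms', iso,
                      '54346782-8bd9-4542-b5be-744da7428268_disk.config',
                      'openstack', '/etc/ceph/ceph.conf')
```

Once the image lives in RBD the local file is redundant, hence the "Deleting local config drive ... because it was imported into RBD" line.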
Jan 26 13:29:13 np0005596062 kernel: tap3aeabb47-ce: entered promiscuous mode
Jan 26 13:29:13 np0005596062 NetworkManager[48993]: <info>  [1769452153.0963] manager: (tap3aeabb47-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.096 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:13Z|00132|binding|INFO|Claiming lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f for this chassis.
Jan 26 13:29:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:13Z|00133|binding|INFO|3aeabb47-ce6c-439f-9d90-fecedc18e77f: Claiming fa:16:3e:be:b6:8b 10.100.0.10
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.100 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.103 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.115 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:b6:8b 10.100.0.10'], port_security=['fa:16:3e:be:b6:8b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '54346782-8bd9-4542-b5be-744da7428268', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2b1e48060904db7a7d629fffdaa921a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0c93d08d-c0a8-4947-b001-f618e8c0b8aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eb7435b-663a-4566-9286-29c15a28c76b, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3aeabb47-ce6c-439f-9d90-fecedc18e77f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.116 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3aeabb47-ce6c-439f-9d90-fecedc18e77f in datapath 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a bound to our chassis#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.117 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.135 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a1e4a9-5fe6-4590-8015-5df462cc5c5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.136 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f70dd9e-91 in ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.139 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f70dd9e-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.139 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[8192b9e2-d4e2-4c5b-a2d5-71215b265580]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.140 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ed30eb8f-44c9-48e8-bafa-1d8fad425fe2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 systemd[1]: Started Virtual Machine qemu-14-instance-00000011.
Jan 26 13:29:13 np0005596062 systemd-machined[195380]: New machine qemu-14-instance-00000011.
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.161 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[f775b5c2-463e-40dc-bc65-02b12c166448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.167 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:13Z|00134|binding|INFO|Setting lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f ovn-installed in OVS
Jan 26 13:29:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:13Z|00135|binding|INFO|Setting lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f up in Southbound
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.172 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 systemd-udevd[250608]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.181 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b20f69d3-28dc-4f93-83a7-cf6184b3c7d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 NetworkManager[48993]: <info>  [1769452153.1922] device (tap3aeabb47-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:29:13 np0005596062 NetworkManager[48993]: <info>  [1769452153.1933] device (tap3aeabb47-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.220 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[dc40fad2-87ae-4e5e-9b71-6386a60d44cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 systemd-udevd[250613]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:29:13 np0005596062 NetworkManager[48993]: <info>  [1769452153.2306] manager: (tap3f70dd9e-90): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.229 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[120a179c-6a3e-4ac0-b187-7b7e3e97835b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.261 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f5749bf5-976c-4d23-936a-c8247dc74728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.264 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[74d18192-9ee4-4d74-9677-f889512b60ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 podman[250593]: 2026-01-26 18:29:13.266746359 +0000 UTC m=+0.119981909 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 26 13:29:13 np0005596062 NetworkManager[48993]: <info>  [1769452153.2869] device (tap3f70dd9e-90): carrier: link connected
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.292 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[3722cb2a-1926-4174-b73e-9edfdb743ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.309 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a461f31e-6248-42f3-ba94-4748cc40665c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f70dd9e-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:13:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591333, 'reachable_time': 42916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250654, 'error': None, 'target': 'ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.326 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9681d1-e0dd-46ce-9f2d-801a58ff2e21]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:1396'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591333, 'tstamp': 591333}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250656, 'error': None, 'target': 'ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.346 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7c2450-8652-4651-a9c4-9119f012a2fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f70dd9e-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:13:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591333, 'reachable_time': 42916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250657, 'error': None, 'target': 'ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.385 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[00a4df98-0d4c-4f28-b1e4-bc0ec54a645c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.456 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4d4c76-3cca-49e1-81f4-f49b2aca34ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.458 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f70dd9e-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.459 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.459 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f70dd9e-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:13 np0005596062 kernel: tap3f70dd9e-90: entered promiscuous mode
Jan 26 13:29:13 np0005596062 NetworkManager[48993]: <info>  [1769452153.4619] manager: (tap3f70dd9e-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.461 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.463 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.464 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f70dd9e-90, col_values=(('external_ids', {'iface-id': 'c02a9bd5-7753-480e-86c4-d809dead851d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.465 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:13Z|00136|binding|INFO|Releasing lport c02a9bd5-7753-480e-86c4-d809dead851d from this chassis (sb_readonly=0)
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.478 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.480 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f70dd9e-997c-43d9-abf7-8ac842dc7a2a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f70dd9e-997c-43d9-abf7-8ac842dc7a2a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.481 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0c6571-0413-46b3-8c20-9dc8f8d6230d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.482 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/3f70dd9e-997c-43d9-abf7-8ac842dc7a2a.pid.haproxy
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:29:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:13.482 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'env', 'PROCESS_TAG=haproxy-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f70dd9e-997c-43d9-abf7-8ac842dc7a2a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.621 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452153.6209216, 54346782-8bd9-4542-b5be-744da7428268 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.622 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] VM Started (Lifecycle Event)#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.646 227317 DEBUG nova.network.neutron [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Updated VIF entry in instance network info cache for port 3aeabb47-ce6c-439f-9d90-fecedc18e77f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.646 227317 DEBUG nova.network.neutron [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Updating instance_info_cache with network_info: [{"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.649 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.653 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452153.6212327, 54346782-8bd9-4542-b5be-744da7428268 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.654 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.666 227317 DEBUG oslo_concurrency.lockutils [req-f6cdb59d-2548-4ca9-b9b5-09a2660030b8 req-4a343594-e785-4c3c-b1f3-39ba5fecb5ca 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-54346782-8bd9-4542-b5be-744da7428268" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.704 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.707 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:29:13 np0005596062 nova_compute[227313]: 2026-01-26 18:29:13.755 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:29:13 np0005596062 podman[250731]: 2026-01-26 18:29:13.863378122 +0000 UTC m=+0.028789866 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:29:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:14 np0005596062 podman[250731]: 2026-01-26 18:29:14.052455096 +0000 UTC m=+0.217866830 container create af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 26 13:29:14 np0005596062 systemd[1]: Started libpod-conmon-af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9.scope.
Jan 26 13:29:14 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:29:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/804ae73cf58afff25b3f36852db8f2c665e4b2ae290df4bd72c991dfe9ab325d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:29:14 np0005596062 podman[250731]: 2026-01-26 18:29:14.200803458 +0000 UTC m=+0.366215192 container init af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:29:14 np0005596062 podman[250731]: 2026-01-26 18:29:14.20877276 +0000 UTC m=+0.374184474 container start af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 26 13:29:14 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [NOTICE]   (250851) : New worker (250853) forked
Jan 26 13:29:14 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [NOTICE]   (250851) : Loading success.
Jan 26 13:29:14 np0005596062 nova_compute[227313]: 2026-01-26 18:29:14.518 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:14.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:15.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:15 np0005596062 nova_compute[227313]: 2026-01-26 18:29:15.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:15 np0005596062 nova_compute[227313]: 2026-01-26 18:29:15.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.052 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.079 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.080 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.081 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.535 227317 DEBUG nova.compute.manager [req-060a505c-fab0-4110-9424-f54cfbd1bee3 req-c1f9b325-00a0-40cc-bc21-c7ec6f7ea5d0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.535 227317 DEBUG oslo_concurrency.lockutils [req-060a505c-fab0-4110-9424-f54cfbd1bee3 req-c1f9b325-00a0-40cc-bc21-c7ec6f7ea5d0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.536 227317 DEBUG oslo_concurrency.lockutils [req-060a505c-fab0-4110-9424-f54cfbd1bee3 req-c1f9b325-00a0-40cc-bc21-c7ec6f7ea5d0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.536 227317 DEBUG oslo_concurrency.lockutils [req-060a505c-fab0-4110-9424-f54cfbd1bee3 req-c1f9b325-00a0-40cc-bc21-c7ec6f7ea5d0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.536 227317 DEBUG nova.compute.manager [req-060a505c-fab0-4110-9424-f54cfbd1bee3 req-c1f9b325-00a0-40cc-bc21-c7ec6f7ea5d0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Processing event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.537 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.545 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452156.5454266, 54346782-8bd9-4542-b5be-744da7428268 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.546 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.547 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.564 227317 INFO nova.virt.libvirt.driver [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] Instance spawned successfully.#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.564 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.579 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.583 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.598 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.599 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.599 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.600 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.600 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.600 227317 DEBUG nova.virt.libvirt.driver [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.627 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.672 227317 INFO nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Took 10.58 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.672 227317 DEBUG nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.717 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.737 227317 INFO nova.compute.manager [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Took 11.98 seconds to build instance.#033[00m
Jan 26 13:29:16 np0005596062 nova_compute[227313]: 2026-01-26 18:29:16.761 227317 DEBUG oslo_concurrency.lockutils [None req-eaacdf5d-1df8-4512-9ade-d27e890756d7 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:16.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:17.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:17 np0005596062 nova_compute[227313]: 2026-01-26 18:29:17.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:17 np0005596062 nova_compute[227313]: 2026-01-26 18:29:17.095 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.045 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.630 227317 DEBUG nova.compute.manager [req-c58a4fba-509b-470e-add9-5adc9675445c req-572c1732-7137-442e-902c-f91ead85b249 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.630 227317 DEBUG oslo_concurrency.lockutils [req-c58a4fba-509b-470e-add9-5adc9675445c req-572c1732-7137-442e-902c-f91ead85b249 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.630 227317 DEBUG oslo_concurrency.lockutils [req-c58a4fba-509b-470e-add9-5adc9675445c req-572c1732-7137-442e-902c-f91ead85b249 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.630 227317 DEBUG oslo_concurrency.lockutils [req-c58a4fba-509b-470e-add9-5adc9675445c req-572c1732-7137-442e-902c-f91ead85b249 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.630 227317 DEBUG nova.compute.manager [req-c58a4fba-509b-470e-add9-5adc9675445c req-572c1732-7137-442e-902c-f91ead85b249 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] No waiting events found dispatching network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:29:18 np0005596062 nova_compute[227313]: 2026-01-26 18:29:18.631 227317 WARNING nova.compute.manager [req-c58a4fba-509b-470e-add9-5adc9675445c req-572c1732-7137-442e-902c-f91ead85b249 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received unexpected event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f for instance with vm_state active and task_state None.#033[00m
Jan 26 13:29:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:18.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:19.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:29:19 np0005596062 nova_compute[227313]: 2026-01-26 18:29:19.922 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:19.922 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:29:19 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:19.924 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:29:20 np0005596062 nova_compute[227313]: 2026-01-26 18:29:20.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:29:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:20.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:21.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:21 np0005596062 nova_compute[227313]: 2026-01-26 18:29:21.720 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:22 np0005596062 nova_compute[227313]: 2026-01-26 18:29:22.097 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:22.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:23.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:24 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:29:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:24.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:25.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.330288) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452165330380, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 493, "num_deletes": 255, "total_data_size": 696401, "memory_usage": 706808, "flush_reason": "Manual Compaction"}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452165339190, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 448863, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38817, "largest_seqno": 39304, "table_properties": {"data_size": 446017, "index_size": 818, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6828, "raw_average_key_size": 19, "raw_value_size": 440310, "raw_average_value_size": 1226, "num_data_blocks": 34, "num_entries": 359, "num_filter_entries": 359, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452147, "oldest_key_time": 1769452147, "file_creation_time": 1769452165, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 8937 microseconds, and 2028 cpu microseconds.
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.339232) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 448863 bytes OK
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.339251) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.341723) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.341740) EVENT_LOG_v1 {"time_micros": 1769452165341734, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.341763) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 693343, prev total WAL file size 693343, number of live WAL files 2.
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.342292) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303035' seq:72057594037927935, type:22 .. '6C6F676D0031323536' seq:0, type:0; will stop at (end)
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(438KB)], [72(10MB)]
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452165342532, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 11317950, "oldest_snapshot_seqno": -1}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6102 keys, 11180228 bytes, temperature: kUnknown
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452165455908, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 11180228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11137541, "index_size": 26321, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15301, "raw_key_size": 157377, "raw_average_key_size": 25, "raw_value_size": 11025673, "raw_average_value_size": 1806, "num_data_blocks": 1058, "num_entries": 6102, "num_filter_entries": 6102, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452165, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.456168) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 11180228 bytes
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.460241) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 99.8 rd, 98.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.4 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(50.1) write-amplify(24.9) OK, records in: 6626, records dropped: 524 output_compression: NoCompression
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.460279) EVENT_LOG_v1 {"time_micros": 1769452165460265, "job": 44, "event": "compaction_finished", "compaction_time_micros": 113437, "compaction_time_cpu_micros": 33208, "output_level": 6, "num_output_files": 1, "total_output_size": 11180228, "num_input_records": 6626, "num_output_records": 6102, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452165460523, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452165462296, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.342225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.462332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.462337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.462338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.462340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:25 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:29:25.462342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:29:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:25.927 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:26 np0005596062 nova_compute[227313]: 2026-01-26 18:29:26.722 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:26.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:27.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:27 np0005596062 nova_compute[227313]: 2026-01-26 18:29:27.098 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.385 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.385 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.386 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.386 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.386 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.387 227317 INFO nova.compute.manager [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Terminating instance#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.388 227317 DEBUG nova.compute.manager [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:29:28 np0005596062 kernel: tap3aeabb47-ce (unregistering): left promiscuous mode
Jan 26 13:29:28 np0005596062 NetworkManager[48993]: <info>  [1769452168.5615] device (tap3aeabb47-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00137|binding|INFO|Releasing lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f from this chassis (sb_readonly=0)
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.572 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00138|binding|INFO|Setting lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f down in Southbound
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00139|binding|INFO|Removing iface tap3aeabb47-ce ovn-installed in OVS
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.576 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00140|binding|INFO|Releasing lport c02a9bd5-7753-480e-86c4-d809dead851d from this chassis (sb_readonly=0)
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.583 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:b6:8b 10.100.0.10'], port_security=['fa:16:3e:be:b6:8b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '54346782-8bd9-4542-b5be-744da7428268', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2b1e48060904db7a7d629fffdaa921a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c93d08d-c0a8-4947-b001-f618e8c0b8aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eb7435b-663a-4566-9286-29c15a28c76b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3aeabb47-ce6c-439f-9d90-fecedc18e77f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.584 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3aeabb47-ce6c-439f-9d90-fecedc18e77f in datapath 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a unbound from our chassis#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.585 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.586 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[57c52627-abba-49f3-871f-222c578a5dc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.587 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a namespace which is not needed anymore#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.592 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.643 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 26 13:29:28 np0005596062 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000011.scope: Consumed 12.479s CPU time.
Jan 26 13:29:28 np0005596062 systemd-machined[195380]: Machine qemu-14-instance-00000011 terminated.
Jan 26 13:29:28 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [NOTICE]   (250851) : haproxy version is 2.8.14-c23fe91
Jan 26 13:29:28 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [NOTICE]   (250851) : path to executable is /usr/sbin/haproxy
Jan 26 13:29:28 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [WARNING]  (250851) : Exiting Master process...
Jan 26 13:29:28 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [ALERT]    (250851) : Current worker (250853) exited with code 143 (Terminated)
Jan 26 13:29:28 np0005596062 neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a[250820]: [WARNING]  (250851) : All workers exited. Exiting... (0)
Jan 26 13:29:28 np0005596062 systemd[1]: libpod-af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9.scope: Deactivated successfully.
Jan 26 13:29:28 np0005596062 podman[251027]: 2026-01-26 18:29:28.732915759 +0000 UTC m=+0.046047495 container died af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 13:29:28 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9-userdata-shm.mount: Deactivated successfully.
Jan 26 13:29:28 np0005596062 systemd[1]: var-lib-containers-storage-overlay-804ae73cf58afff25b3f36852db8f2c665e4b2ae290df4bd72c991dfe9ab325d-merged.mount: Deactivated successfully.
Jan 26 13:29:28 np0005596062 podman[251027]: 2026-01-26 18:29:28.80785667 +0000 UTC m=+0.120988396 container cleanup af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 13:29:28 np0005596062 kernel: tap3aeabb47-ce: entered promiscuous mode
Jan 26 13:29:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:28.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:28 np0005596062 NetworkManager[48993]: <info>  [1769452168.8134] manager: (tap3aeabb47-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Jan 26 13:29:28 np0005596062 kernel: tap3aeabb47-ce (unregistering): left promiscuous mode
Jan 26 13:29:28 np0005596062 systemd[1]: libpod-conmon-af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9.scope: Deactivated successfully.
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00141|binding|INFO|Claiming lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f for this chassis.
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00142|binding|INFO|3aeabb47-ce6c-439f-9d90-fecedc18e77f: Claiming fa:16:3e:be:b6:8b 10.100.0.10
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.821 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.842 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:b6:8b 10.100.0.10'], port_security=['fa:16:3e:be:b6:8b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '54346782-8bd9-4542-b5be-744da7428268', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2b1e48060904db7a7d629fffdaa921a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c93d08d-c0a8-4947-b001-f618e8c0b8aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eb7435b-663a-4566-9286-29c15a28c76b, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3aeabb47-ce6c-439f-9d90-fecedc18e77f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.845 227317 INFO nova.virt.libvirt.driver [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] Instance destroyed successfully.#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.846 227317 DEBUG nova.objects.instance [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lazy-loading 'resources' on Instance uuid 54346782-8bd9-4542-b5be-744da7428268 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.861 227317 DEBUG nova.virt.libvirt.vif [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:29:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-717426979',display_name='tempest-TestServerMultinode-server-717426979',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-717426979',id=17,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:29:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f2b1e48060904db7a7d629fffdaa921a',ramdisk_id='',reservation_id='r-bibtuuhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-128980879',owner_user_name='tempest-TestServerMultinode-128980879-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:29:16Z,user_data=None,user_id='87b6f2cd2d124de2be281e270184d195',uuid=54346782-8bd9-4542-b5be-744da7428268,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.861 227317 DEBUG nova.network.os_vif_util [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Converting VIF {"id": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "address": "fa:16:3e:be:b6:8b", "network": {"id": "3f70dd9e-997c-43d9-abf7-8ac842dc7a2a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1075445344-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dd033a95e4c454f82b471fb31b8c978", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3aeabb47-ce", "ovs_interfaceid": "3aeabb47-ce6c-439f-9d90-fecedc18e77f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.862 227317 DEBUG nova.network.os_vif_util [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.862 227317 DEBUG os_vif [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.864 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3aeabb47-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.866 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 ovn_controller[133984]: 2026-01-26T18:29:28Z|00143|binding|INFO|Releasing lport 3aeabb47-ce6c-439f-9d90-fecedc18e77f from this chassis (sb_readonly=0)
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.868 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.874 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:b6:8b 10.100.0.10'], port_security=['fa:16:3e:be:b6:8b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '54346782-8bd9-4542-b5be-744da7428268', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f2b1e48060904db7a7d629fffdaa921a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c93d08d-c0a8-4947-b001-f618e8c0b8aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eb7435b-663a-4566-9286-29c15a28c76b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3aeabb47-ce6c-439f-9d90-fecedc18e77f) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.899 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.902 227317 INFO os_vif [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:b6:8b,bridge_name='br-int',has_traffic_filtering=True,id=3aeabb47-ce6c-439f-9d90-fecedc18e77f,network=Network(3f70dd9e-997c-43d9-abf7-8ac842dc7a2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3aeabb47-ce')#033[00m
Jan 26 13:29:28 np0005596062 podman[251060]: 2026-01-26 18:29:28.907146318 +0000 UTC m=+0.061668159 container remove af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.914 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[3eaae372-4d86-41b3-b07f-ad58b2638f04]: (4, ('Mon Jan 26 06:29:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a (af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9)\naf97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9\nMon Jan 26 06:29:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a (af97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9)\naf97312180d773df040197c87b132bcc812c037ec420baa71e5bda9aa49d54e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.917 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1d8edb-b598-4e62-b1f7-61a5fe271495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.918 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f70dd9e-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:29:28 np0005596062 kernel: tap3f70dd9e-90: left promiscuous mode
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.926 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 nova_compute[227313]: 2026-01-26 18:29:28.935 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.935 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[24ebf629-5d52-4d67-9b31-e81e528da494]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.949 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[55e13c94-6f52-4944-ab14-f408738a8884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.951 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9238fa-7107-4e91-8537-bf1b0789c960]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.968 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ce9be3-bcb2-4194-945b-8642e2a4f84f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591326, 'reachable_time': 38764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251102, 'error': None, 'target': 'ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 systemd[1]: run-netns-ovnmeta\x2d3f70dd9e\x2d997c\x2d43d9\x2dabf7\x2d8ac842dc7a2a.mount: Deactivated successfully.
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.973 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f70dd9e-997c-43d9-abf7-8ac842dc7a2a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.974 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9032d9-dd7c-4ffe-99aa-ce1e5bcf1897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.975 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3aeabb47-ce6c-439f-9d90-fecedc18e77f in datapath 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a unbound from our chassis#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.976 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.977 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2c8e0d-1a40-4277-838b-3b258f315aa4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.978 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3aeabb47-ce6c-439f-9d90-fecedc18e77f in datapath 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a unbound from our chassis#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.979 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f70dd9e-997c-43d9-abf7-8ac842dc7a2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:29:28 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:29:28.979 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fa303f81-10e1-40db-899f-5dfcbb94b1f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:29:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:29.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:29 np0005596062 nova_compute[227313]: 2026-01-26 18:29:29.199 227317 DEBUG nova.compute.manager [req-a8311219-48cf-4757-9b42-1b6d4ae3ff42 req-db65886b-3e17-4806-9f61-324836279b68 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-vif-unplugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:29:29 np0005596062 nova_compute[227313]: 2026-01-26 18:29:29.200 227317 DEBUG oslo_concurrency.lockutils [req-a8311219-48cf-4757-9b42-1b6d4ae3ff42 req-db65886b-3e17-4806-9f61-324836279b68 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:29 np0005596062 nova_compute[227313]: 2026-01-26 18:29:29.200 227317 DEBUG oslo_concurrency.lockutils [req-a8311219-48cf-4757-9b42-1b6d4ae3ff42 req-db65886b-3e17-4806-9f61-324836279b68 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:29 np0005596062 nova_compute[227313]: 2026-01-26 18:29:29.200 227317 DEBUG oslo_concurrency.lockutils [req-a8311219-48cf-4757-9b42-1b6d4ae3ff42 req-db65886b-3e17-4806-9f61-324836279b68 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:29 np0005596062 nova_compute[227313]: 2026-01-26 18:29:29.201 227317 DEBUG nova.compute.manager [req-a8311219-48cf-4757-9b42-1b6d4ae3ff42 req-db65886b-3e17-4806-9f61-324836279b68 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] No waiting events found dispatching network-vif-unplugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:29:29 np0005596062 nova_compute[227313]: 2026-01-26 18:29:29.201 227317 DEBUG nova.compute.manager [req-a8311219-48cf-4757-9b42-1b6d4ae3ff42 req-db65886b-3e17-4806-9f61-324836279b68 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-vif-unplugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:29:30 np0005596062 nova_compute[227313]: 2026-01-26 18:29:30.445 227317 INFO nova.virt.libvirt.driver [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Deleting instance files /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268_del#033[00m
Jan 26 13:29:30 np0005596062 nova_compute[227313]: 2026-01-26 18:29:30.446 227317 INFO nova.virt.libvirt.driver [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Deletion of /var/lib/nova/instances/54346782-8bd9-4542-b5be-744da7428268_del complete#033[00m
Jan 26 13:29:30 np0005596062 nova_compute[227313]: 2026-01-26 18:29:30.514 227317 INFO nova.compute.manager [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Took 2.13 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:29:30 np0005596062 nova_compute[227313]: 2026-01-26 18:29:30.515 227317 DEBUG oslo.service.loopingcall [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:29:30 np0005596062 nova_compute[227313]: 2026-01-26 18:29:30.515 227317 DEBUG nova.compute.manager [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:29:30 np0005596062 nova_compute[227313]: 2026-01-26 18:29:30.515 227317 DEBUG nova.network.neutron [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:29:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:30.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:31.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.643 227317 DEBUG nova.compute.manager [req-a6b9c060-b1ba-4809-8f80-2083e72a1c7f req-50307a89-ebd2-4141-9c0c-20159676c93d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.643 227317 DEBUG oslo_concurrency.lockutils [req-a6b9c060-b1ba-4809-8f80-2083e72a1c7f req-50307a89-ebd2-4141-9c0c-20159676c93d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "54346782-8bd9-4542-b5be-744da7428268-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.643 227317 DEBUG oslo_concurrency.lockutils [req-a6b9c060-b1ba-4809-8f80-2083e72a1c7f req-50307a89-ebd2-4141-9c0c-20159676c93d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.643 227317 DEBUG oslo_concurrency.lockutils [req-a6b9c060-b1ba-4809-8f80-2083e72a1c7f req-50307a89-ebd2-4141-9c0c-20159676c93d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.644 227317 DEBUG nova.compute.manager [req-a6b9c060-b1ba-4809-8f80-2083e72a1c7f req-50307a89-ebd2-4141-9c0c-20159676c93d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] No waiting events found dispatching network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.644 227317 WARNING nova.compute.manager [req-a6b9c060-b1ba-4809-8f80-2083e72a1c7f req-50307a89-ebd2-4141-9c0c-20159676c93d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received unexpected event network-vif-plugged-3aeabb47-ce6c-439f-9d90-fecedc18e77f for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:29:31 np0005596062 nova_compute[227313]: 2026-01-26 18:29:31.723 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:31 np0005596062 podman[251106]: 2026-01-26 18:29:31.865992708 +0000 UTC m=+0.068015368 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 13:29:32 np0005596062 nova_compute[227313]: 2026-01-26 18:29:32.415 227317 DEBUG nova.network.neutron [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:29:32 np0005596062 nova_compute[227313]: 2026-01-26 18:29:32.472 227317 INFO nova.compute.manager [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] Took 1.96 seconds to deallocate network for instance.#033[00m
Jan 26 13:29:32 np0005596062 nova_compute[227313]: 2026-01-26 18:29:32.598 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:32 np0005596062 nova_compute[227313]: 2026-01-26 18:29:32.599 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:32.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:32 np0005596062 nova_compute[227313]: 2026-01-26 18:29:32.877 227317 DEBUG oslo_concurrency.processutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:33.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.261 227317 DEBUG nova.compute.manager [req-bc9a6bab-3030-4ff4-97cd-68a1da650700 req-c4d86e12-672c-468e-bffa-0285f995e697 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 54346782-8bd9-4542-b5be-744da7428268] Received event network-vif-deleted-3aeabb47-ce6c-439f-9d90-fecedc18e77f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:29:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:29:33 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/517016434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.351 227317 DEBUG oslo_concurrency.processutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.360 227317 DEBUG nova.compute.provider_tree [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.474 227317 DEBUG nova.scheduler.client.report [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.549 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.600 227317 INFO nova.scheduler.client.report [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Deleted allocations for instance 54346782-8bd9-4542-b5be-744da7428268#033[00m
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.720 227317 DEBUG oslo_concurrency.lockutils [None req-1ce456d0-235b-4ec9-8da9-712a418bb13b 87b6f2cd2d124de2be281e270184d195 f2b1e48060904db7a7d629fffdaa921a - - default default] Lock "54346782-8bd9-4542-b5be-744da7428268" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:33 np0005596062 nova_compute[227313]: 2026-01-26 18:29:33.867 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:34.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:35.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:36 np0005596062 nova_compute[227313]: 2026-01-26 18:29:36.725 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:36.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:37.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:38.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:38 np0005596062 nova_compute[227313]: 2026-01-26 18:29:38.871 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:39.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:40.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:41.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:41 np0005596062 nova_compute[227313]: 2026-01-26 18:29:41.760 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:42.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:43.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:43 np0005596062 nova_compute[227313]: 2026-01-26 18:29:43.841 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452168.8402894, 54346782-8bd9-4542-b5be-744da7428268 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:29:43 np0005596062 nova_compute[227313]: 2026-01-26 18:29:43.842 227317 INFO nova.compute.manager [-] [instance: 54346782-8bd9-4542-b5be-744da7428268] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:29:43 np0005596062 nova_compute[227313]: 2026-01-26 18:29:43.869 227317 DEBUG nova.compute.manager [None req-3fd9f550-dc1c-4885-b248-b0fae3a9707b - - - - - -] [instance: 54346782-8bd9-4542-b5be-744da7428268] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:29:43 np0005596062 nova_compute[227313]: 2026-01-26 18:29:43.923 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:43 np0005596062 podman[251205]: 2026-01-26 18:29:43.943622728 +0000 UTC m=+0.144380067 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 26 13:29:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:44 np0005596062 nova_compute[227313]: 2026-01-26 18:29:44.563 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:44.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:45.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 26 13:29:46 np0005596062 nova_compute[227313]: 2026-01-26 18:29:46.762 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:46.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:47.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:48.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:48 np0005596062 nova_compute[227313]: 2026-01-26 18:29:48.925 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:49.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:50.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:51.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:51 np0005596062 nova_compute[227313]: 2026-01-26 18:29:51.806 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:52.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:53.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:53 np0005596062 nova_compute[227313]: 2026-01-26 18:29:53.928 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:54 np0005596062 nova_compute[227313]: 2026-01-26 18:29:54.434 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:54 np0005596062 nova_compute[227313]: 2026-01-26 18:29:54.434 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:54 np0005596062 nova_compute[227313]: 2026-01-26 18:29:54.558 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:29:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:29:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:54.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:29:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:55.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:56 np0005596062 nova_compute[227313]: 2026-01-26 18:29:56.331 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:29:56 np0005596062 nova_compute[227313]: 2026-01-26 18:29:56.331 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:29:56 np0005596062 nova_compute[227313]: 2026-01-26 18:29:56.339 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:29:56 np0005596062 nova_compute[227313]: 2026-01-26 18:29:56.340 227317 INFO nova.compute.claims [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:29:56 np0005596062 nova_compute[227313]: 2026-01-26 18:29:56.809 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:56.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:57.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:57 np0005596062 nova_compute[227313]: 2026-01-26 18:29:57.511 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:29:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:29:57 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/331470855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:29:57 np0005596062 nova_compute[227313]: 2026-01-26 18:29:57.928 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:29:57 np0005596062 nova_compute[227313]: 2026-01-26 18:29:57.935 227317 DEBUG nova.compute.provider_tree [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:29:58 np0005596062 nova_compute[227313]: 2026-01-26 18:29:58.385 227317 DEBUG nova.scheduler.client.report [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:29:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:29:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:29:58.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:29:58 np0005596062 nova_compute[227313]: 2026-01-26 18:29:58.931 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:29:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:29:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:29:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:29:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:29:59.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:29:59 np0005596062 nova_compute[227313]: 2026-01-26 18:29:59.851 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:29:59 np0005596062 nova_compute[227313]: 2026-01-26 18:29:59.852 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:30:00 np0005596062 nova_compute[227313]: 2026-01-26 18:30:00.010 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:30:00 np0005596062 nova_compute[227313]: 2026-01-26 18:30:00.011 227317 DEBUG nova.network.neutron [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:30:00 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:30:00 np0005596062 nova_compute[227313]: 2026-01-26 18:30:00.321 227317 INFO nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:30:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:00.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:01.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:01 np0005596062 nova_compute[227313]: 2026-01-26 18:30:01.255 227317 DEBUG nova.policy [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1859ed83e26a48fdadcb5b9899dae46e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b158396183b64160b56d3c4df4ae6550', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:30:01 np0005596062 nova_compute[227313]: 2026-01-26 18:30:01.812 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:02 np0005596062 nova_compute[227313]: 2026-01-26 18:30:02.047 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:30:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:30:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:02.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:30:02 np0005596062 podman[251316]: 2026-01-26 18:30:02.873957034 +0000 UTC m=+0.070502844 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:30:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:03.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:03 np0005596062 nova_compute[227313]: 2026-01-26 18:30:03.514 227317 INFO nova.virt.block_device [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Booting with volume snapshot 6de02530-297d-4b9e-9457-7a21d7db5b63 at /dev/vda#033[00m
Jan 26 13:30:03 np0005596062 nova_compute[227313]: 2026-01-26 18:30:03.934 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:04.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:30:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:05.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:30:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:05.634 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:30:05 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:05.635 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:30:05 np0005596062 nova_compute[227313]: 2026-01-26 18:30:05.636 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:06 np0005596062 nova_compute[227313]: 2026-01-26 18:30:06.271 227317 DEBUG nova.network.neutron [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Successfully created port: f073758b-10b1-4f5b-9fd0-04c5020c56dc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:30:06 np0005596062 nova_compute[227313]: 2026-01-26 18:30:06.813 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:06.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:07.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:07 np0005596062 nova_compute[227313]: 2026-01-26 18:30:07.905 227317 DEBUG nova.network.neutron [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Successfully updated port: f073758b-10b1-4f5b-9fd0-04c5020c56dc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:30:07 np0005596062 nova_compute[227313]: 2026-01-26 18:30:07.924 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "refresh_cache-7248159b-fc8f-4676-ae14-d348d1874528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:30:07 np0005596062 nova_compute[227313]: 2026-01-26 18:30:07.925 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquired lock "refresh_cache-7248159b-fc8f-4676-ae14-d348d1874528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:30:07 np0005596062 nova_compute[227313]: 2026-01-26 18:30:07.925 227317 DEBUG nova.network.neutron [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:30:08 np0005596062 nova_compute[227313]: 2026-01-26 18:30:08.114 227317 DEBUG nova.compute.manager [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-changed-f073758b-10b1-4f5b-9fd0-04c5020c56dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:30:08 np0005596062 nova_compute[227313]: 2026-01-26 18:30:08.114 227317 DEBUG nova.compute.manager [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Refreshing instance network info cache due to event network-changed-f073758b-10b1-4f5b-9fd0-04c5020c56dc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:30:08 np0005596062 nova_compute[227313]: 2026-01-26 18:30:08.115 227317 DEBUG oslo_concurrency.lockutils [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-7248159b-fc8f-4676-ae14-d348d1874528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:30:08 np0005596062 nova_compute[227313]: 2026-01-26 18:30:08.220 227317 DEBUG nova.network.neutron [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:30:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:08.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:08 np0005596062 nova_compute[227313]: 2026-01-26 18:30:08.937 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:09.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:09.180 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:09.181 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:09.181 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:09 np0005596062 nova_compute[227313]: 2026-01-26 18:30:09.288 227317 DEBUG nova.network.neutron [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Updating instance_info_cache with network_info: [{"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:30:09 np0005596062 nova_compute[227313]: 2026-01-26 18:30:09.309 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Releasing lock "refresh_cache-7248159b-fc8f-4676-ae14-d348d1874528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:30:09 np0005596062 nova_compute[227313]: 2026-01-26 18:30:09.310 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Instance network_info: |[{"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:30:09 np0005596062 nova_compute[227313]: 2026-01-26 18:30:09.310 227317 DEBUG oslo_concurrency.lockutils [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-7248159b-fc8f-4676-ae14-d348d1874528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:30:09 np0005596062 nova_compute[227313]: 2026-01-26 18:30:09.311 227317 DEBUG nova.network.neutron [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Refreshing network info cache for port f073758b-10b1-4f5b-9fd0-04c5020c56dc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.117 227317 DEBUG os_brick.utils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.118 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.129 232828 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.130 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[bda4ec66-6f40-4e4b-956d-4f2cf6bb6b16]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.131 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.137 232828 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.138 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[c4bca5a2-733b-4928-aadb-c651c634e392]: (4, ('InitiatorName=iqn.1994-05.com.redhat:c828cff26df4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.139 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.147 232828 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.147 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[b90a2e86-b0e2-4fd4-b508-fac4f7e60c62]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.148 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[217855fd-7592-4f14-aa58-98d57f1adc09]: (4, '5c33c4b0-14ac-46af-8c94-d3bb1b6300af') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.149 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.171 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.174 227317 DEBUG os_brick.initiator.connectors.lightos [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.174 227317 DEBUG os_brick.initiator.connectors.lightos [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.175 227317 DEBUG os_brick.initiator.connectors.lightos [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.175 227317 DEBUG os_brick.utils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] <== get_connector_properties: return (57ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:c828cff26df4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5c33c4b0-14ac-46af-8c94-d3bb1b6300af', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 26 13:30:10 np0005596062 nova_compute[227313]: 2026-01-26 18:30:10.175 227317 DEBUG nova.virt.block_device [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Updating existing volume attachment record: 9f34a6fa-add7-4516-9225-6a9813df84b9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 26 13:30:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:10.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:11.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:11.639 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:11 np0005596062 nova_compute[227313]: 2026-01-26 18:30:11.816 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.300 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.301 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.301 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.301 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.302 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:30:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1311018443' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.726 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:12.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.887 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.888 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4790MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.888 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:12 np0005596062 nova_compute[227313]: 2026-01-26 18:30:12.888 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:13.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.211 227317 DEBUG nova.network.neutron [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Updated VIF entry in instance network info cache for port f073758b-10b1-4f5b-9fd0-04c5020c56dc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.212 227317 DEBUG nova.network.neutron [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Updating instance_info_cache with network_info: [{"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.228 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 7248159b-fc8f-4676-ae14-d348d1874528 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.229 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.230 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.245 227317 DEBUG oslo_concurrency.lockutils [req-bddb05a5-794a-406f-8859-764de35b9094 req-4d4c3af0-5556-4daa-aabf-ead60cd445ef 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-7248159b-fc8f-4676-ae14-d348d1874528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.249 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.251 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.251 227317 INFO nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Creating image(s)#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.251 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.252 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Ensure instance console log exists: /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.252 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.252 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.253 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.255 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Start _get_guest_xml network_info=[{"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/vda', 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'attachment_id': '9f34a6fa-add7-4516-9225-6a9813df84b9', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-92cd157b-154d-425b-bab9-ce127490a9ca', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '92cd157b-154d-425b-bab9-ce127490a9ca', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '7248159b-fc8f-4676-ae14-d348d1874528', 'attached_at': '', 'detached_at': '', 'volume_id': '92cd157b-154d-425b-bab9-ce127490a9ca', 'serial': '92cd157b-154d-425b-bab9-ce127490a9ca'}, 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.262 227317 WARNING nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.270 227317 DEBUG nova.virt.libvirt.host [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.271 227317 DEBUG nova.virt.libvirt.host [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.273 227317 DEBUG nova.virt.libvirt.host [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.274 227317 DEBUG nova.virt.libvirt.host [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.275 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.275 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.275 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.276 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.276 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.276 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.276 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.277 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.277 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.277 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.277 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.277 227317 DEBUG nova.virt.hardware [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.307 227317 DEBUG nova.storage.rbd_utils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] rbd image 7248159b-fc8f-4676-ae14-d348d1874528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.311 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.333 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2186423683' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3837755243' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3107260326' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.764 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.774 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.780 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.807 227317 DEBUG nova.virt.libvirt.vif [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-733700910',display_name='tempest-TestVolumeBootPattern-server-733700910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-733700910',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b158396183b64160b56d3c4df4ae6550',ramdisk_id='',reservation_id='r-cmy0eofn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200254346',owner_user_name='tempest-TestVolumeBootPattern-1200254346-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=
None,updated_at=2026-01-26T18:30:03Z,user_data=None,user_id='1859ed83e26a48fdadcb5b9899dae46e',uuid=7248159b-fc8f-4676-ae14-d348d1874528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.808 227317 DEBUG nova.network.os_vif_util [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converting VIF {"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.809 227317 DEBUG nova.network.os_vif_util [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.810 227317 DEBUG nova.objects.instance [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7248159b-fc8f-4676-ae14-d348d1874528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.813 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.829 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <uuid>7248159b-fc8f-4676-ae14-d348d1874528</uuid>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <name>instance-00000013</name>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestVolumeBootPattern-server-733700910</nova:name>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:30:13</nova:creationTime>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:user uuid="1859ed83e26a48fdadcb5b9899dae46e">tempest-TestVolumeBootPattern-1200254346-project-member</nova:user>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:project uuid="b158396183b64160b56d3c4df4ae6550">tempest-TestVolumeBootPattern-1200254346</nova:project>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <nova:port uuid="f073758b-10b1-4f5b-9fd0-04c5020c56dc">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <entry name="serial">7248159b-fc8f-4676-ae14-d348d1874528</entry>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <entry name="uuid">7248159b-fc8f-4676-ae14-d348d1874528</entry>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/7248159b-fc8f-4676-ae14-d348d1874528_disk.config">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="volumes/volume-92cd157b-154d-425b-bab9-ce127490a9ca">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <serial>92cd157b-154d-425b-bab9-ce127490a9ca</serial>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:fb:80:6d"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <target dev="tapf073758b-10"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/console.log" append="off"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:30:13 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:30:13 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:30:13 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:30:13 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.831 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Preparing to wait for external event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.831 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.832 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.832 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.833 227317 DEBUG nova.virt.libvirt.vif [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-733700910',display_name='tempest-TestVolumeBootPattern-server-733700910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-733700910',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b158396183b64160b56d3c4df4ae6550',ramdisk_id='',reservation_id='r-cmy0eofn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200254346',owner_user_name='tempest-TestVolumeBootPattern-1200254346-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:30:03Z,user_data=None,user_id='1859ed83e26a48fdadcb5b9899dae46e',uuid=7248159b-fc8f-4676-ae14-d348d1874528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.833 227317 DEBUG nova.network.os_vif_util [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converting VIF {"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.834 227317 DEBUG nova.network.os_vif_util [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.834 227317 DEBUG os_vif [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.834 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.835 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.835 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.838 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.838 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf073758b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.839 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf073758b-10, col_values=(('external_ids', {'iface-id': 'f073758b-10b1-4f5b-9fd0-04c5020c56dc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:80:6d', 'vm-uuid': '7248159b-fc8f-4676-ae14-d348d1874528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.840 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:13 np0005596062 NetworkManager[48993]: <info>  [1769452213.8420] manager: (tapf073758b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.843 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.847 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.848 227317 INFO os_vif [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10')#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.852 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.853 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.905 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.906 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.906 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] No VIF found with MAC fa:16:3e:fb:80:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.907 227317 INFO nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Using config drive#033[00m
Jan 26 13:30:13 np0005596062 nova_compute[227313]: 2026-01-26 18:30:13.929 227317 DEBUG nova.storage.rbd_utils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] rbd image 7248159b-fc8f-4676-ae14-d348d1874528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:30:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.268 227317 INFO nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Creating config drive at /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/disk.config#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.273 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyripo6fu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.402 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyripo6fu" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.428 227317 DEBUG nova.storage.rbd_utils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] rbd image 7248159b-fc8f-4676-ae14-d348d1874528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.432 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/disk.config 7248159b-fc8f-4676-ae14-d348d1874528_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.730 227317 DEBUG oslo_concurrency.processutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/disk.config 7248159b-fc8f-4676-ae14-d348d1874528_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.732 227317 INFO nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Deleting local config drive /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528/disk.config because it was imported into RBD.#033[00m
Jan 26 13:30:14 np0005596062 kernel: tapf073758b-10: entered promiscuous mode
Jan 26 13:30:14 np0005596062 NetworkManager[48993]: <info>  [1769452214.7835] manager: (tapf073758b-10): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.784 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:14Z|00144|binding|INFO|Claiming lport f073758b-10b1-4f5b-9fd0-04c5020c56dc for this chassis.
Jan 26 13:30:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:14Z|00145|binding|INFO|f073758b-10b1-4f5b-9fd0-04c5020c56dc: Claiming fa:16:3e:fb:80:6d 10.100.0.14
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.791 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.805 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:80:6d 10.100.0.14'], port_security=['fa:16:3e:fb:80:6d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7248159b-fc8f-4676-ae14-d348d1874528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6ffe169-5606-433e-936f-c0a2554b460d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b158396183b64160b56d3c4df4ae6550', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2073c991-7fbf-4f8f-a63c-07abc42a736f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd476994-76c7-4ad1-88ba-247776af23a7, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=f073758b-10b1-4f5b-9fd0-04c5020c56dc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.806 143929 INFO neutron.agent.ovn.metadata.agent [-] Port f073758b-10b1-4f5b-9fd0-04c5020c56dc in datapath a6ffe169-5606-433e-936f-c0a2554b460d bound to our chassis#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.808 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6ffe169-5606-433e-936f-c0a2554b460d#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.819 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4acf41-0aa7-4902-96e7-de25d9736f5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.820 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6ffe169-51 in ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.823 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6ffe169-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.823 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4bebc480-7b3e-4747-93a3-3eb8d5fdb2a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 systemd-udevd[251515]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.823 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c5584c96-2d62-4b69-b3ff-b2bc763d08ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 systemd-machined[195380]: New machine qemu-15-instance-00000013.
Jan 26 13:30:14 np0005596062 NetworkManager[48993]: <info>  [1769452214.8431] device (tapf073758b-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:30:14 np0005596062 NetworkManager[48993]: <info>  [1769452214.8436] device (tapf073758b-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.844 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[242ea132-9df4-47ef-bb70-fba9a641319a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.853 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.854 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.858 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.860 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a67c4d6d-1db6-4fa3-a99f-2a9fd94df132]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 systemd[1]: Started Virtual Machine qemu-15-instance-00000013.
Jan 26 13:30:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:14.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:14Z|00146|binding|INFO|Setting lport f073758b-10b1-4f5b-9fd0-04c5020c56dc ovn-installed in OVS
Jan 26 13:30:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:14Z|00147|binding|INFO|Setting lport f073758b-10b1-4f5b-9fd0-04c5020c56dc up in Southbound
Jan 26 13:30:14 np0005596062 nova_compute[227313]: 2026-01-26 18:30:14.868 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.896 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[29fc7e7c-22b9-478e-baa7-ff8ed18aaa55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.902 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d044a2-4e5f-4e1e-bd03-cb835f2beb4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 NetworkManager[48993]: <info>  [1769452214.9033] manager: (tapa6ffe169-50): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 26 13:30:14 np0005596062 systemd-udevd[251525]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:30:14 np0005596062 podman[251498]: 2026-01-26 18:30:14.933742315 +0000 UTC m=+0.133490868 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.933 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[fb611919-0e6a-461a-9a18-ce7a4ec4480b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.938 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f0de1bd2-6f8e-4034-9477-668581959583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 NetworkManager[48993]: <info>  [1769452214.9575] device (tapa6ffe169-50): carrier: link connected
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.963 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[7b820423-0e7c-439e-b2c6-3e013f13b2cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.979 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[25024533-b126-47c3-b7be-9b8de8323944]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6ffe169-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:69:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597500, 'reachable_time': 25474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251564, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:14.992 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[33cb347e-0e92-43a6-96a3-5db99876760a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:69b6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 597500, 'tstamp': 597500}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251565, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.011 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bb654091-0edb-4f0b-91f5-0bfa5310be46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6ffe169-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:69:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597500, 'reachable_time': 25474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251566, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.047 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8d2134-aaad-4487-a163-53baed76ae37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.115 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[771de365-fb0e-4b3d-99cb-7d8ea1fdff08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.117 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6ffe169-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.118 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.118 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6ffe169-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.121 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:15 np0005596062 kernel: tapa6ffe169-50: entered promiscuous mode
Jan 26 13:30:15 np0005596062 NetworkManager[48993]: <info>  [1769452215.1223] manager: (tapa6ffe169-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.125 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6ffe169-50, col_values=(('external_ids', {'iface-id': '57b3bf6f-2b11-4a16-b6fc-c8194ada158a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.127 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:15 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:15Z|00148|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:30:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:15.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.129 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.130 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6ffe169-5606-433e-936f-c0a2554b460d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6ffe169-5606-433e-936f-c0a2554b460d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.132 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cd266ce2-d361-4712-a681-60286fcbb908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.133 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-a6ffe169-5606-433e-936f-c0a2554b460d
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/a6ffe169-5606-433e-936f-c0a2554b460d.pid.haproxy
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID a6ffe169-5606-433e-936f-c0a2554b460d
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:30:15 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:15.134 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'env', 'PROCESS_TAG=haproxy-a6ffe169-5606-433e-936f-c0a2554b460d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6ffe169-5606-433e-936f-c0a2554b460d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.144 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.207 227317 DEBUG nova.compute.manager [req-eb053e4c-74e4-46cc-acc0-cceb7072be46 req-0573fc9d-7bc8-47e7-878c-faec192a1b45 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.207 227317 DEBUG oslo_concurrency.lockutils [req-eb053e4c-74e4-46cc-acc0-cceb7072be46 req-0573fc9d-7bc8-47e7-878c-faec192a1b45 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.207 227317 DEBUG oslo_concurrency.lockutils [req-eb053e4c-74e4-46cc-acc0-cceb7072be46 req-0573fc9d-7bc8-47e7-878c-faec192a1b45 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.208 227317 DEBUG oslo_concurrency.lockutils [req-eb053e4c-74e4-46cc-acc0-cceb7072be46 req-0573fc9d-7bc8-47e7-878c-faec192a1b45 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.208 227317 DEBUG nova.compute.manager [req-eb053e4c-74e4-46cc-acc0-cceb7072be46 req-0573fc9d-7bc8-47e7-878c-faec192a1b45 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Processing event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.431 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.432 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452215.4300642, 7248159b-fc8f-4676-ae14-d348d1874528 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.432 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] VM Started (Lifecycle Event)#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.437 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.441 227317 INFO nova.virt.libvirt.driver [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Instance spawned successfully.#033[00m
Jan 26 13:30:15 np0005596062 nova_compute[227313]: 2026-01-26 18:30:15.441 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:30:15 np0005596062 podman[251640]: 2026-01-26 18:30:15.545771728 +0000 UTC m=+0.045906211 container create 3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:30:15 np0005596062 systemd[1]: Started libpod-conmon-3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9.scope.
Jan 26 13:30:15 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:30:15 np0005596062 podman[251640]: 2026-01-26 18:30:15.521092472 +0000 UTC m=+0.021226975 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:30:15 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f8ab32e093f36863b35fac5f2effd6d1f4e7a7edc8769d18ff3115c29d7fd82/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:30:15 np0005596062 podman[251640]: 2026-01-26 18:30:15.62902932 +0000 UTC m=+0.129163823 container init 3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:30:15 np0005596062 podman[251640]: 2026-01-26 18:30:15.634028393 +0000 UTC m=+0.134162886 container start 3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 13:30:15 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [NOTICE]   (251659) : New worker (251661) forked
Jan 26 13:30:15 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [NOTICE]   (251659) : Loading success.
Jan 26 13:30:16 np0005596062 nova_compute[227313]: 2026-01-26 18:30:16.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:16 np0005596062 nova_compute[227313]: 2026-01-26 18:30:16.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:30:16 np0005596062 nova_compute[227313]: 2026-01-26 18:30:16.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:30:16 np0005596062 nova_compute[227313]: 2026-01-26 18:30:16.120 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:30:16 np0005596062 nova_compute[227313]: 2026-01-26 18:30:16.125 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:30:16 np0005596062 nova_compute[227313]: 2026-01-26 18:30:16.817 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:16.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.068 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.068 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.069 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.069 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.070 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.072 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.072 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.072 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.073 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.073 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.073 227317 DEBUG nova.virt.libvirt.driver [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:30:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:17.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.140 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.141 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452215.430676, 7248159b-fc8f-4676-ae14-d348d1874528 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.141 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.379 227317 INFO nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Took 4.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.379 227317 DEBUG nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.512 227317 DEBUG nova.compute.manager [req-f7430196-36d0-478e-b29b-d077a2551da1 req-b27a4420-b3f0-4041-878b-7faa9497c02c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.512 227317 DEBUG oslo_concurrency.lockutils [req-f7430196-36d0-478e-b29b-d077a2551da1 req-b27a4420-b3f0-4041-878b-7faa9497c02c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.512 227317 DEBUG oslo_concurrency.lockutils [req-f7430196-36d0-478e-b29b-d077a2551da1 req-b27a4420-b3f0-4041-878b-7faa9497c02c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.513 227317 DEBUG oslo_concurrency.lockutils [req-f7430196-36d0-478e-b29b-d077a2551da1 req-b27a4420-b3f0-4041-878b-7faa9497c02c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.513 227317 DEBUG nova.compute.manager [req-f7430196-36d0-478e-b29b-d077a2551da1 req-b27a4420-b3f0-4041-878b-7faa9497c02c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] No waiting events found dispatching network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.513 227317 WARNING nova.compute.manager [req-f7430196-36d0-478e-b29b-d077a2551da1 req-b27a4420-b3f0-4041-878b-7faa9497c02c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received unexpected event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc for instance with vm_state building and task_state spawning.#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.530 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.534 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452215.4364343, 7248159b-fc8f-4676-ae14-d348d1874528 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.534 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.603 227317 INFO nova.compute.manager [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Took 22.49 seconds to build instance.#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.608 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.613 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:30:17 np0005596062 nova_compute[227313]: 2026-01-26 18:30:17.820 227317 DEBUG oslo_concurrency.lockutils [None req-3563de97-c9ae-4391-a0cb-8cdd1dd1905e 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:18 np0005596062 nova_compute[227313]: 2026-01-26 18:30:18.842 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:18.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:19 np0005596062 nova_compute[227313]: 2026-01-26 18:30:19.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:19 np0005596062 nova_compute[227313]: 2026-01-26 18:30:19.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:30:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:19.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:30:19 np0005596062 nova_compute[227313]: 2026-01-26 18:30:19.238 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.036 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.037 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.037 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.037 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.038 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.039 227317 INFO nova.compute.manager [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Terminating instance#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.040 227317 DEBUG nova.compute.manager [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:30:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:21.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:21 np0005596062 kernel: tapf073758b-10 (unregistering): left promiscuous mode
Jan 26 13:30:21 np0005596062 NetworkManager[48993]: <info>  [1769452221.1795] device (tapf073758b-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:30:21 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:21Z|00149|binding|INFO|Releasing lport f073758b-10b1-4f5b-9fd0-04c5020c56dc from this chassis (sb_readonly=0)
Jan 26 13:30:21 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:21Z|00150|binding|INFO|Setting lport f073758b-10b1-4f5b-9fd0-04c5020c56dc down in Southbound
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.183 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 ovn_controller[133984]: 2026-01-26T18:30:21Z|00151|binding|INFO|Removing iface tapf073758b-10 ovn-installed in OVS
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.184 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.203 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 26 13:30:21 np0005596062 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000013.scope: Consumed 6.442s CPU time.
Jan 26 13:30:21 np0005596062 systemd-machined[195380]: Machine qemu-15-instance-00000013 terminated.
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.257 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.262 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.272 227317 INFO nova.virt.libvirt.driver [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Instance destroyed successfully.#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.273 227317 DEBUG nova.objects.instance [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lazy-loading 'resources' on Instance uuid 7248159b-fc8f-4676-ae14-d348d1874528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:30:21 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:21.635 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:80:6d 10.100.0.14'], port_security=['fa:16:3e:fb:80:6d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7248159b-fc8f-4676-ae14-d348d1874528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6ffe169-5606-433e-936f-c0a2554b460d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b158396183b64160b56d3c4df4ae6550', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2073c991-7fbf-4f8f-a63c-07abc42a736f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd476994-76c7-4ad1-88ba-247776af23a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=f073758b-10b1-4f5b-9fd0-04c5020c56dc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:30:21 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:21.637 143929 INFO neutron.agent.ovn.metadata.agent [-] Port f073758b-10b1-4f5b-9fd0-04c5020c56dc in datapath a6ffe169-5606-433e-936f-c0a2554b460d unbound from our chassis#033[00m
Jan 26 13:30:21 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:21.638 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6ffe169-5606-433e-936f-c0a2554b460d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:30:21 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:21.640 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fa8c8e1f-fced-43c6-be73-23885f591631]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:21 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:21.641 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d namespace which is not needed anymore#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.748 227317 DEBUG nova.virt.libvirt.vif [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-733700910',display_name='tempest-TestVolumeBootPattern-server-733700910',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-733700910',id=19,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:30:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b158396183b64160b56d3c4df4ae6550',ramdisk_id='',reservation_id='r-cmy0eofn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1200254346',owner
_user_name='tempest-TestVolumeBootPattern-1200254346-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:30:17Z,user_data=None,user_id='1859ed83e26a48fdadcb5b9899dae46e',uuid=7248159b-fc8f-4676-ae14-d348d1874528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.749 227317 DEBUG nova.network.os_vif_util [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converting VIF {"id": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "address": "fa:16:3e:fb:80:6d", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf073758b-10", "ovs_interfaceid": "f073758b-10b1-4f5b-9fd0-04c5020c56dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.750 227317 DEBUG nova.network.os_vif_util [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.750 227317 DEBUG os_vif [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.752 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.752 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf073758b-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.781 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.783 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.786 227317 INFO os_vif [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:6d,bridge_name='br-int',has_traffic_filtering=True,id=f073758b-10b1-4f5b-9fd0-04c5020c56dc,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf073758b-10')#033[00m
Jan 26 13:30:21 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [NOTICE]   (251659) : haproxy version is 2.8.14-c23fe91
Jan 26 13:30:21 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [NOTICE]   (251659) : path to executable is /usr/sbin/haproxy
Jan 26 13:30:21 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [WARNING]  (251659) : Exiting Master process...
Jan 26 13:30:21 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [ALERT]    (251659) : Current worker (251661) exited with code 143 (Terminated)
Jan 26 13:30:21 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[251655]: [WARNING]  (251659) : All workers exited. Exiting... (0)
Jan 26 13:30:21 np0005596062 systemd[1]: libpod-3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9.scope: Deactivated successfully.
Jan 26 13:30:21 np0005596062 podman[251757]: 2026-01-26 18:30:21.933843864 +0000 UTC m=+0.196776010 container died 3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:30:21 np0005596062 nova_compute[227313]: 2026-01-26 18:30:21.946 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.010 227317 DEBUG nova.compute.manager [req-23e2454d-a236-46c7-aaa4-b663b9906aa4 req-15c78c3f-8901-4664-8c44-0750ea781c09 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-vif-unplugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.011 227317 DEBUG oslo_concurrency.lockutils [req-23e2454d-a236-46c7-aaa4-b663b9906aa4 req-15c78c3f-8901-4664-8c44-0750ea781c09 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.012 227317 DEBUG oslo_concurrency.lockutils [req-23e2454d-a236-46c7-aaa4-b663b9906aa4 req-15c78c3f-8901-4664-8c44-0750ea781c09 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.012 227317 DEBUG oslo_concurrency.lockutils [req-23e2454d-a236-46c7-aaa4-b663b9906aa4 req-15c78c3f-8901-4664-8c44-0750ea781c09 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.012 227317 DEBUG nova.compute.manager [req-23e2454d-a236-46c7-aaa4-b663b9906aa4 req-15c78c3f-8901-4664-8c44-0750ea781c09 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] No waiting events found dispatching network-vif-unplugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.012 227317 DEBUG nova.compute.manager [req-23e2454d-a236-46c7-aaa4-b663b9906aa4 req-15c78c3f-8901-4664-8c44-0750ea781c09 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-vif-unplugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:30:22 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9-userdata-shm.mount: Deactivated successfully.
Jan 26 13:30:22 np0005596062 systemd[1]: var-lib-containers-storage-overlay-6f8ab32e093f36863b35fac5f2effd6d1f4e7a7edc8769d18ff3115c29d7fd82-merged.mount: Deactivated successfully.
Jan 26 13:30:22 np0005596062 podman[251757]: 2026-01-26 18:30:22.141237335 +0000 UTC m=+0.404169481 container cleanup 3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 13:30:22 np0005596062 systemd[1]: libpod-conmon-3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9.scope: Deactivated successfully.
Jan 26 13:30:22 np0005596062 podman[251805]: 2026-01-26 18:30:22.590466871 +0000 UTC m=+0.425971259 container remove 3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.596 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9683ee-4294-4312-9b2e-5e66d176133b]: (4, ('Mon Jan 26 06:30:21 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d (3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9)\n3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9\nMon Jan 26 06:30:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d (3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9)\n3024e53ec774c2299148a3f4c7ec33277707a6affe05d85d1df93898cb1132e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.598 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ca81e073-078a-4479-bc8d-99d1cc29b38b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.599 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6ffe169-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:22 np0005596062 kernel: tapa6ffe169-50: left promiscuous mode
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.614 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.618 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ab04d9b3-43d1-4756-b912-05544eb0df48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.632 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b3907cf7-0709-4eed-92da-27153dab1d45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.634 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[094ad89e-3e1d-45ba-9020-81e4b18ee567]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.652 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b1302cf4-5ebc-4971-b02b-58738f53f3f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597494, 'reachable_time': 38780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251821, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 systemd[1]: run-netns-ovnmeta\x2da6ffe169\x2d5606\x2d433e\x2d936f\x2dc0a2554b460d.mount: Deactivated successfully.
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.655 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:30:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:22.655 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bf9a34-077d-457c-8069-ed3624709023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.673 227317 INFO nova.virt.libvirt.driver [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Deleting instance files /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528_del#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.674 227317 INFO nova.virt.libvirt.driver [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Deletion of /var/lib/nova/instances/7248159b-fc8f-4676-ae14-d348d1874528_del complete#033[00m
Jan 26 13:30:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:22.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.994 227317 INFO nova.compute.manager [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Took 1.95 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.994 227317 DEBUG oslo.service.loopingcall [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.994 227317 DEBUG nova.compute.manager [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:30:22 np0005596062 nova_compute[227313]: 2026-01-26 18:30:22.995 227317 DEBUG nova.network.neutron [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:30:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:23.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:24 np0005596062 nova_compute[227313]: 2026-01-26 18:30:24.701 227317 DEBUG nova.compute.manager [req-752715a5-a922-49ea-8d38-d4a50544790d req-ca6805d7-8beb-4459-a78b-23107e982b66 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:30:24 np0005596062 nova_compute[227313]: 2026-01-26 18:30:24.701 227317 DEBUG oslo_concurrency.lockutils [req-752715a5-a922-49ea-8d38-d4a50544790d req-ca6805d7-8beb-4459-a78b-23107e982b66 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "7248159b-fc8f-4676-ae14-d348d1874528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:24 np0005596062 nova_compute[227313]: 2026-01-26 18:30:24.702 227317 DEBUG oslo_concurrency.lockutils [req-752715a5-a922-49ea-8d38-d4a50544790d req-ca6805d7-8beb-4459-a78b-23107e982b66 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:24 np0005596062 nova_compute[227313]: 2026-01-26 18:30:24.702 227317 DEBUG oslo_concurrency.lockutils [req-752715a5-a922-49ea-8d38-d4a50544790d req-ca6805d7-8beb-4459-a78b-23107e982b66 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:24 np0005596062 nova_compute[227313]: 2026-01-26 18:30:24.702 227317 DEBUG nova.compute.manager [req-752715a5-a922-49ea-8d38-d4a50544790d req-ca6805d7-8beb-4459-a78b-23107e982b66 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] No waiting events found dispatching network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:30:24 np0005596062 nova_compute[227313]: 2026-01-26 18:30:24.702 227317 WARNING nova.compute.manager [req-752715a5-a922-49ea-8d38-d4a50544790d req-ca6805d7-8beb-4459-a78b-23107e982b66 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received unexpected event network-vif-plugged-f073758b-10b1-4f5b-9fd0-04c5020c56dc for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:30:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:24.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:25.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:26 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:30:26 np0005596062 nova_compute[227313]: 2026-01-26 18:30:26.576 227317 DEBUG nova.network.neutron [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:30:26 np0005596062 nova_compute[227313]: 2026-01-26 18:30:26.782 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:26 np0005596062 nova_compute[227313]: 2026-01-26 18:30:26.821 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:26.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:30:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:27.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.300 227317 DEBUG nova.compute.manager [req-52a7576e-1d7a-45fc-9a5a-fd4e757678a2 req-d6702578-64d9-479d-a4f9-1ca4528862e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Received event network-vif-deleted-f073758b-10b1-4f5b-9fd0-04c5020c56dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.300 227317 INFO nova.compute.manager [req-52a7576e-1d7a-45fc-9a5a-fd4e757678a2 req-d6702578-64d9-479d-a4f9-1ca4528862e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Neutron deleted interface f073758b-10b1-4f5b-9fd0-04c5020c56dc; detaching it from the instance and deleting it from the info cache#033[00m
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.300 227317 DEBUG nova.network.neutron [req-52a7576e-1d7a-45fc-9a5a-fd4e757678a2 req-d6702578-64d9-479d-a4f9-1ca4528862e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.302 227317 INFO nova.compute.manager [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Took 4.31 seconds to deallocate network for instance.#033[00m
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.324 227317 DEBUG nova.compute.manager [req-52a7576e-1d7a-45fc-9a5a-fd4e757678a2 req-d6702578-64d9-479d-a4f9-1ca4528862e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Detach interface failed, port_id=f073758b-10b1-4f5b-9fd0-04c5020c56dc, reason: Instance 7248159b-fc8f-4676-ae14-d348d1874528 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.831 227317 INFO nova.compute.manager [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Took 0.53 seconds to detach 1 volumes for instance.#033[00m
Jan 26 13:30:27 np0005596062 nova_compute[227313]: 2026-01-26 18:30:27.832 227317 DEBUG nova.compute.manager [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Deleting volume: 92cd157b-154d-425b-bab9-ce127490a9ca _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 26 13:30:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:30:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:30:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:30:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:28.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:30:28 np0005596062 nova_compute[227313]: 2026-01-26 18:30:28.936 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:30:28 np0005596062 nova_compute[227313]: 2026-01-26 18:30:28.936 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:30:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:29 np0005596062 nova_compute[227313]: 2026-01-26 18:30:29.018 227317 DEBUG oslo_concurrency.processutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:30:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:29.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:30:29 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/396894197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:30:29 np0005596062 nova_compute[227313]: 2026-01-26 18:30:29.518 227317 DEBUG oslo_concurrency.processutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:30:29 np0005596062 nova_compute[227313]: 2026-01-26 18:30:29.525 227317 DEBUG nova.compute.provider_tree [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:30:29 np0005596062 nova_compute[227313]: 2026-01-26 18:30:29.735 227317 DEBUG nova.scheduler.client.report [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:30:29 np0005596062 nova_compute[227313]: 2026-01-26 18:30:29.837 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:30 np0005596062 nova_compute[227313]: 2026-01-26 18:30:30.027 227317 INFO nova.scheduler.client.report [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Deleted allocations for instance 7248159b-fc8f-4676-ae14-d348d1874528#033[00m
Jan 26 13:30:30 np0005596062 nova_compute[227313]: 2026-01-26 18:30:30.144 227317 DEBUG oslo_concurrency.lockutils [None req-beb282db-558e-47d5-852e-ceeccfc57c1d 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "7248159b-fc8f-4676-ae14-d348d1874528" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:30:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:30.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:30:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:31.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:30:31 np0005596062 nova_compute[227313]: 2026-01-26 18:30:31.786 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:31 np0005596062 nova_compute[227313]: 2026-01-26 18:30:31.823 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:30:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:32.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:30:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:33.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 26 13:30:33 np0005596062 podman[251980]: 2026-01-26 18:30:33.858004468 +0000 UTC m=+0.059196585 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 26 13:30:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:34.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:35.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:36 np0005596062 nova_compute[227313]: 2026-01-26 18:30:36.272 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452221.2708426, 7248159b-fc8f-4676-ae14-d348d1874528 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:30:36 np0005596062 nova_compute[227313]: 2026-01-26 18:30:36.272 227317 INFO nova.compute.manager [-] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:30:36 np0005596062 nova_compute[227313]: 2026-01-26 18:30:36.334 227317 DEBUG nova.compute.manager [None req-3088dbd7-dee1-4c38-890a-9204a39fced3 - - - - - -] [instance: 7248159b-fc8f-4676-ae14-d348d1874528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:30:36 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:30:36 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:30:36 np0005596062 nova_compute[227313]: 2026-01-26 18:30:36.859 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4994-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:30:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:36.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:30:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:37.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:30:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:38.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:39.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:39 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 26 13:30:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:30:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:40.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:30:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:41.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:41 np0005596062 nova_compute[227313]: 2026-01-26 18:30:41.862 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:30:41 np0005596062 nova_compute[227313]: 2026-01-26 18:30:41.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:41 np0005596062 nova_compute[227313]: 2026-01-26 18:30:41.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 13:30:41 np0005596062 nova_compute[227313]: 2026-01-26 18:30:41.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:30:41 np0005596062 nova_compute[227313]: 2026-01-26 18:30:41.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 13:30:41 np0005596062 nova_compute[227313]: 2026-01-26 18:30:41.864 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:42.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:43.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:44 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:44 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 26 13:30:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:45.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:45 np0005596062 podman[252107]: 2026-01-26 18:30:45.86142117 +0000 UTC m=+0.076746540 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:30:46 np0005596062 nova_compute[227313]: 2026-01-26 18:30:46.865 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:30:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:47.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:48 np0005596062 nova_compute[227313]: 2026-01-26 18:30:48.325 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:48 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:48.325 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:30:48 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:48.328 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:30:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:48.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:49.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:49 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:30:49.330 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:30:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:50.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:30:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:51.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:30:51 np0005596062 nova_compute[227313]: 2026-01-26 18:30:51.867 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:52.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:53.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:54.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:55.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:55 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:30:56 np0005596062 nova_compute[227313]: 2026-01-26 18:30:56.869 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:30:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:56.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:57.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:30:58.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:30:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:30:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:30:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:30:59.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:00.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:01.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:01 np0005596062 nova_compute[227313]: 2026-01-26 18:31:01.870 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:31:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:02.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:03.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:04 np0005596062 podman[252193]: 2026-01-26 18:31:04.879929196 +0000 UTC m=+0.077092797 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:31:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:31:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:04.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:31:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:05.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:06 np0005596062 nova_compute[227313]: 2026-01-26 18:31:06.872 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:06.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:07.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:08.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:09.181 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:09.182 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:09.182 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:09.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:10.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:11.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:11 np0005596062 nova_compute[227313]: 2026-01-26 18:31:11.581 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:11 np0005596062 nova_compute[227313]: 2026-01-26 18:31:11.582 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:11 np0005596062 nova_compute[227313]: 2026-01-26 18:31:11.625 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:31:11 np0005596062 nova_compute[227313]: 2026-01-26 18:31:11.874 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.026 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.027 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.046 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.047 227317 INFO nova.compute.claims [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.231 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:12 np0005596062 nova_compute[227313]: 2026-01-26 18:31:12.595 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:12.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:31:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/827541199' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.160 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.167 227317 DEBUG nova.compute.provider_tree [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.199 227317 DEBUG nova.scheduler.client.report [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:31:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:31:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:13.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.394 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.395 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.399 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.399 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.400 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.401 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.497 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.497 227317 DEBUG nova.network.neutron [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.532 227317 INFO nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.586 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.654 227317 INFO nova.virt.block_device [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Booting with volume f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c at /dev/vda#033[00m
Jan 26 13:31:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:31:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2294694477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:31:13 np0005596062 nova_compute[227313]: 2026-01-26 18:31:13.925 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.043 227317 DEBUG os_brick.utils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.045 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.058 232828 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.058 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[79f5ac8d-fe1c-4ed6-89fc-6adb8c9aadcf]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.060 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.069 232828 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.069 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[102ebd6c-8bb6-4e58-9bde-45283e325ff2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:c828cff26df4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.072 232828 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.081 232828 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.081 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[b88ce185-92bb-4413-a8f0-fee4ff04e1bf]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.082 232828 DEBUG oslo.privsep.daemon [-] privsep: reply[e328b805-d3e5-4351-a855-6948f5624ece]: (4, '5c33c4b0-14ac-46af-8c94-d3bb1b6300af') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.083 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.106 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.109 227317 DEBUG os_brick.initiator.connectors.lightos [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.110 227317 DEBUG os_brick.initiator.connectors.lightos [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.110 227317 DEBUG os_brick.initiator.connectors.lightos [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.110 227317 DEBUG os_brick.utils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:c828cff26df4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5c33c4b0-14ac-46af-8c94-d3bb1b6300af', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.111 227317 DEBUG nova.virt.block_device [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updating existing volume attachment record: 8b0c1e34-c053-4add-b876-ff14fd0e95da _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.163 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.165 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4791MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.165 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.165 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.342 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance acd8b26d-b140-49c9-94cc-9d68fd5fa9bd actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.343 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.343 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.394 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:31:14 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3171681607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.873 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.881 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.902 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:31:14 np0005596062 nova_compute[227313]: 2026-01-26 18:31:14.909 227317 DEBUG nova.policy [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1859ed83e26a48fdadcb5b9899dae46e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b158396183b64160b56d3c4df4ae6550', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:31:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:14.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.016 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.016 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:15.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.853 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.855 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.855 227317 INFO nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Creating image(s)#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.855 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.856 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Ensure instance console log exists: /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.856 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.856 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:15 np0005596062 nova_compute[227313]: 2026-01-26 18:31:15.856 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:16 np0005596062 podman[252292]: 2026-01-26 18:31:16.863103424 +0000 UTC m=+0.076077930 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 13:31:16 np0005596062 nova_compute[227313]: 2026-01-26 18:31:16.876 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:16 np0005596062 nova_compute[227313]: 2026-01-26 18:31:16.899 227317 DEBUG nova.network.neutron [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Successfully created port: 95a388aa-20fa-4cf0-a52b-c0f58db57705 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:31:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:16.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:17 np0005596062 nova_compute[227313]: 2026-01-26 18:31:17.018 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:17 np0005596062 nova_compute[227313]: 2026-01-26 18:31:17.019 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:17 np0005596062 nova_compute[227313]: 2026-01-26 18:31:17.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:17.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.353 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.353 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.353 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:18 np0005596062 nova_compute[227313]: 2026-01-26 18:31:18.354 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:31:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:18.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:31:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:19.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.031 227317 DEBUG nova.network.neutron [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Successfully updated port: 95a388aa-20fa-4cf0-a52b-c0f58db57705 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.099 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.099 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquired lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.100 227317 DEBUG nova.network.neutron [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.249 227317 DEBUG nova.compute.manager [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-changed-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.249 227317 DEBUG nova.compute.manager [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Refreshing instance network info cache due to event network-changed-95a388aa-20fa-4cf0-a52b-c0f58db57705. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.249 227317 DEBUG oslo_concurrency.lockutils [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.349 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:20 np0005596062 nova_compute[227313]: 2026-01-26 18:31:20.408 227317 DEBUG nova.network.neutron [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:31:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:20.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:21 np0005596062 nova_compute[227313]: 2026-01-26 18:31:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:21.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:21 np0005596062 nova_compute[227313]: 2026-01-26 18:31:21.877 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:21 np0005596062 nova_compute[227313]: 2026-01-26 18:31:21.879 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.505 227317 DEBUG nova.network.neutron [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updating instance_info_cache with network_info: [{"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.578 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Releasing lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.578 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Instance network_info: |[{"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.578 227317 DEBUG oslo_concurrency.lockutils [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.579 227317 DEBUG nova.network.neutron [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Refreshing network info cache for port 95a388aa-20fa-4cf0-a52b-c0f58db57705 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.581 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Start _get_guest_xml network_info=[{"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/vda', 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'attachment_id': '8b0c1e34-c053-4add-b876-ff14fd0e95da', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'acd8b26d-b140-49c9-94cc-9d68fd5fa9bd', 'attached_at': '', 'detached_at': '', 'volume_id': 'f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c', 'serial': 'f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c'}, 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.586 227317 WARNING nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.591 227317 DEBUG nova.virt.libvirt.host [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.592 227317 DEBUG nova.virt.libvirt.host [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.604 227317 DEBUG nova.virt.libvirt.host [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.605 227317 DEBUG nova.virt.libvirt.host [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.606 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.607 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.607 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.608 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.608 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.608 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.609 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.609 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.609 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.610 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.610 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.610 227317 DEBUG nova.virt.hardware [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.643 227317 DEBUG nova.storage.rbd_utils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] rbd image acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:31:22 np0005596062 nova_compute[227313]: 2026-01-26 18:31:22.648 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:22.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:31:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:31:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1482287444' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.119 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:23.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.373 227317 DEBUG nova.virt.libvirt.vif [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1269771433',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1269771433',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1269771433',id=21,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInDz+GLYJtk95PlzrHqTNQERTx1bynGIuwweyi5YHrc/aXQ2pURgEiq/Gs5/yI9jMLkStr288XvF2jTexSQOlBlxgRG2TOYDl2OkNKCPigp/5UAqde30Xz/zQyXB4jPlQ==',key_name='tempest-keypair-809081161',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b158396183b64160b56d3c4df4ae6550',ramdisk_id='',reservation_id='r-fta0379e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200254346',owner_user_name='tempest-TestVolumeBootPattern-1200254346-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:31:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1859ed83e26a48fdadcb5b9899dae46e',uuid=acd8b26d-b140-49c9-94cc-9d68fd5fa9bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.374 227317 DEBUG nova.network.os_vif_util [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converting VIF {"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.375 227317 DEBUG nova.network.os_vif_util [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.376 227317 DEBUG nova.objects.instance [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lazy-loading 'pci_devices' on Instance uuid acd8b26d-b140-49c9-94cc-9d68fd5fa9bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.429 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <uuid>acd8b26d-b140-49c9-94cc-9d68fd5fa9bd</uuid>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <name>instance-00000015</name>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestVolumeBootPattern-volume-backed-server-1269771433</nova:name>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:31:22</nova:creationTime>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:user uuid="1859ed83e26a48fdadcb5b9899dae46e">tempest-TestVolumeBootPattern-1200254346-project-member</nova:user>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:project uuid="b158396183b64160b56d3c4df4ae6550">tempest-TestVolumeBootPattern-1200254346</nova:project>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <nova:port uuid="95a388aa-20fa-4cf0-a52b-c0f58db57705">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <entry name="serial">acd8b26d-b140-49c9-94cc-9d68fd5fa9bd</entry>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <entry name="uuid">acd8b26d-b140-49c9-94cc-9d68fd5fa9bd</entry>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_disk.config">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="volumes/volume-f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <serial>f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c</serial>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:30:ec:da"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <target dev="tap95a388aa-20"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/console.log" append="off"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:31:23 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:31:23 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:31:23 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:31:23 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.431 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Preparing to wait for external event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.431 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.432 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.432 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.433 227317 DEBUG nova.virt.libvirt.vif [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1269771433',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1269771433',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1269771433',id=21,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInDz+GLYJtk95PlzrHqTNQERTx1bynGIuwweyi5YHrc/aXQ2pURgEiq/Gs5/yI9jMLkStr288XvF2jTexSQOlBlxgRG2TOYDl2OkNKCPigp/5UAqde30Xz/zQyXB4jPlQ==',key_name='tempest-keypair-809081161',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b158396183b64160b56d3c4df4ae6550',ramdisk_id='',reservation_id='r-fta0379e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-1200254346',owner_user_name='tempest-TestVolumeBootPattern-1200254346-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:31:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1859ed83e26a48fdadcb5b9899dae46e',uuid=acd8b26d-b140-49c9-94cc-9d68fd5fa9bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.433 227317 DEBUG nova.network.os_vif_util [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converting VIF {"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.434 227317 DEBUG nova.network.os_vif_util [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.434 227317 DEBUG os_vif [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.435 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.435 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.436 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.439 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.439 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95a388aa-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.440 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap95a388aa-20, col_values=(('external_ids', {'iface-id': '95a388aa-20fa-4cf0-a52b-c0f58db57705', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:ec:da', 'vm-uuid': 'acd8b26d-b140-49c9-94cc-9d68fd5fa9bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.442 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:23 np0005596062 NetworkManager[48993]: <info>  [1769452283.4433] manager: (tap95a388aa-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.444 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.448 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.449 227317 INFO os_vif [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20')#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.510 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.510 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.511 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] No VIF found with MAC fa:16:3e:30:ec:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.511 227317 INFO nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Using config drive#033[00m
Jan 26 13:31:23 np0005596062 nova_compute[227313]: 2026-01-26 18:31:23.533 227317 DEBUG nova.storage.rbd_utils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] rbd image acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.073 227317 INFO nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Creating config drive at /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/disk.config#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.082 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmci8xhvi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.217 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmci8xhvi" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.246 227317 DEBUG nova.storage.rbd_utils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] rbd image acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.250 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/disk.config acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.433 227317 DEBUG oslo_concurrency.processutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/disk.config acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.434 227317 INFO nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Deleting local config drive /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd/disk.config because it was imported into RBD.#033[00m
Jan 26 13:31:24 np0005596062 kernel: tap95a388aa-20: entered promiscuous mode
Jan 26 13:31:24 np0005596062 NetworkManager[48993]: <info>  [1769452284.4989] manager: (tap95a388aa-20): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Jan 26 13:31:24 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:24Z|00152|binding|INFO|Claiming lport 95a388aa-20fa-4cf0-a52b-c0f58db57705 for this chassis.
Jan 26 13:31:24 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:24Z|00153|binding|INFO|95a388aa-20fa-4cf0-a52b-c0f58db57705: Claiming fa:16:3e:30:ec:da 10.100.0.10
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.505 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:24Z|00154|binding|INFO|Setting lport 95a388aa-20fa-4cf0-a52b-c0f58db57705 ovn-installed in OVS
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.523 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.527 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:24Z|00155|binding|INFO|Setting lport 95a388aa-20fa-4cf0-a52b-c0f58db57705 up in Southbound
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.530 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:ec:da 10.100.0.10'], port_security=['fa:16:3e:30:ec:da 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'acd8b26d-b140-49c9-94cc-9d68fd5fa9bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6ffe169-5606-433e-936f-c0a2554b460d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b158396183b64160b56d3c4df4ae6550', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0f7a06fa-e94c-4ca3-aef6-af3df08dfe1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd476994-76c7-4ad1-88ba-247776af23a7, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=95a388aa-20fa-4cf0-a52b-c0f58db57705) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.532 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 95a388aa-20fa-4cf0-a52b-c0f58db57705 in datapath a6ffe169-5606-433e-936f-c0a2554b460d bound to our chassis#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.534 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6ffe169-5606-433e-936f-c0a2554b460d#033[00m
Jan 26 13:31:24 np0005596062 systemd-udevd[252485]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:31:24 np0005596062 systemd-machined[195380]: New machine qemu-16-instance-00000015.
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.545 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2bbca32e-abe5-41bc-9857-7d04b35c30fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.547 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6ffe169-51 in ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:31:24 np0005596062 NetworkManager[48993]: <info>  [1769452284.5491] device (tap95a388aa-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.549 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6ffe169-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.549 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[1a279f85-9be0-424f-aee9-38600256a88b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 NetworkManager[48993]: <info>  [1769452284.5502] device (tap95a388aa-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.550 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6777ad-04a0-4c7a-8d54-18eaa6054eb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.566 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9238ba-7c4a-4964-9ec5-bae3a06f21ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.591 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d4a81d-8dd7-486b-a474-55ee5499d81a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.626 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[ef60129a-cdde-458d-b2e9-1df52b20636a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.632 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bc230dd2-dc66-48c4-a1be-94e4628ce98c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 NetworkManager[48993]: <info>  [1769452284.6342] manager: (tapa6ffe169-50): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.667 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[f50b6b37-1677-4257-9bf2-0fcddc1c2b59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.672 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[10a0193f-5a34-47ef-9aef-cec50062b59a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 NetworkManager[48993]: <info>  [1769452284.7010] device (tapa6ffe169-50): carrier: link connected
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.708 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a9d893-46f4-4d87-bc7e-ffcf7cdf8df1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.728 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5720ae-c4e9-4863-9ec4-9c88aadc1aa3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6ffe169-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:69:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604475, 'reachable_time': 35485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252519, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.752 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5c74cb0d-7455-4bad-bafb-89778a04ad0a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:69b6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 604475, 'tstamp': 604475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252520, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.769 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[072f171d-048d-4320-bee7-7131551c3c62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6ffe169-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:69:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604475, 'reachable_time': 35485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252521, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.799 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c7cb46-51ae-4873-b737-45f3782d5db6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.858 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[992506a4-5350-4a4e-87a7-7575f993f715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.860 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6ffe169-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.860 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.861 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6ffe169-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.863 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 kernel: tapa6ffe169-50: entered promiscuous mode
Jan 26 13:31:24 np0005596062 NetworkManager[48993]: <info>  [1769452284.8651] manager: (tapa6ffe169-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.869 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.871 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6ffe169-50, col_values=(('external_ids', {'iface-id': '57b3bf6f-2b11-4a16-b6fc-c8194ada158a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.872 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:24Z|00156|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.891 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.895 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.896 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6ffe169-5606-433e-936f-c0a2554b460d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6ffe169-5606-433e-936f-c0a2554b460d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.896 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0349f13c-7268-4f2d-8570-894fd9a39e8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.897 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-a6ffe169-5606-433e-936f-c0a2554b460d
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/a6ffe169-5606-433e-936f-c0a2554b460d.pid.haproxy
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID a6ffe169-5606-433e-936f-c0a2554b460d
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:31:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:24.897 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'env', 'PROCESS_TAG=haproxy-a6ffe169-5606-433e-936f-c0a2554b460d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6ffe169-5606-433e-936f-c0a2554b460d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.919 227317 DEBUG nova.network.neutron [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updated VIF entry in instance network info cache for port 95a388aa-20fa-4cf0-a52b-c0f58db57705. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.920 227317 DEBUG nova.network.neutron [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updating instance_info_cache with network_info: [{"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:31:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:24.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:24 np0005596062 nova_compute[227313]: 2026-01-26 18:31:24.964 227317 DEBUG oslo_concurrency.lockutils [req-0221a475-e448-48e1-9b9a-1af72e3c2257 req-541841ec-e3ed-4745-b3af-64334506ca6f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:31:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:25.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:25 np0005596062 podman[252553]: 2026-01-26 18:31:25.291618955 +0000 UTC m=+0.049487951 container create 47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:31:25 np0005596062 systemd[1]: Started libpod-conmon-47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce.scope.
Jan 26 13:31:25 np0005596062 podman[252553]: 2026-01-26 18:31:25.264954384 +0000 UTC m=+0.022823410 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:31:25 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:31:25 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a3f2a5997f46f2d014a1054fe1bd7972795996fb83493bb1dbc80a05dead548/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:31:25 np0005596062 podman[252553]: 2026-01-26 18:31:25.386502706 +0000 UTC m=+0.144371722 container init 47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 13:31:25 np0005596062 podman[252553]: 2026-01-26 18:31:25.394141779 +0000 UTC m=+0.152010775 container start 47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 13:31:25 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [NOTICE]   (252598) : New worker (252609) forked
Jan 26 13:31:25 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [NOTICE]   (252598) : Loading success.
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.430 227317 DEBUG nova.compute.manager [req-da8cb27f-b098-4414-af4f-1528d838738e req-d8db0a62-d5fe-4ab9-9663-2fcea6b6a6db 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.430 227317 DEBUG oslo_concurrency.lockutils [req-da8cb27f-b098-4414-af4f-1528d838738e req-d8db0a62-d5fe-4ab9-9663-2fcea6b6a6db 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.430 227317 DEBUG oslo_concurrency.lockutils [req-da8cb27f-b098-4414-af4f-1528d838738e req-d8db0a62-d5fe-4ab9-9663-2fcea6b6a6db 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.431 227317 DEBUG oslo_concurrency.lockutils [req-da8cb27f-b098-4414-af4f-1528d838738e req-d8db0a62-d5fe-4ab9-9663-2fcea6b6a6db 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.431 227317 DEBUG nova.compute.manager [req-da8cb27f-b098-4414-af4f-1528d838738e req-d8db0a62-d5fe-4ab9-9663-2fcea6b6a6db 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Processing event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.528 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.529 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452285.5276341, acd8b26d-b140-49c9-94cc-9d68fd5fa9bd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.529 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] VM Started (Lifecycle Event)#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.531 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.534 227317 INFO nova.virt.libvirt.driver [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Instance spawned successfully.#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.534 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.563 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.567 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.567 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.568 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.568 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.568 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.569 227317 DEBUG nova.virt.libvirt.driver [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.572 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.617 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.617 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452285.5287452, acd8b26d-b140-49c9-94cc-9d68fd5fa9bd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.618 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.644 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.648 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452285.5308106, acd8b26d-b140-49c9-94cc-9d68fd5fa9bd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.648 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.658 227317 INFO nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Took 9.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.658 227317 DEBUG nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.694 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.696 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.764 227317 INFO nova.compute.manager [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Took 13.82 seconds to build instance.#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.792 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:31:25 np0005596062 nova_compute[227313]: 2026-01-26 18:31:25.793 227317 DEBUG oslo_concurrency.lockutils [None req-2372c291-a202-4006-bce2-60afded57ba1 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:26 np0005596062 nova_compute[227313]: 2026-01-26 18:31:26.880 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:26.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:27.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:27 np0005596062 nova_compute[227313]: 2026-01-26 18:31:27.726 227317 DEBUG nova.compute.manager [req-29ee7bc3-afbf-442d-bb4a-5af361918fc6 req-beb7850a-74b1-4fb3-917c-dbed26064cfd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:31:27 np0005596062 nova_compute[227313]: 2026-01-26 18:31:27.726 227317 DEBUG oslo_concurrency.lockutils [req-29ee7bc3-afbf-442d-bb4a-5af361918fc6 req-beb7850a-74b1-4fb3-917c-dbed26064cfd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:31:27 np0005596062 nova_compute[227313]: 2026-01-26 18:31:27.727 227317 DEBUG oslo_concurrency.lockutils [req-29ee7bc3-afbf-442d-bb4a-5af361918fc6 req-beb7850a-74b1-4fb3-917c-dbed26064cfd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:31:27 np0005596062 nova_compute[227313]: 2026-01-26 18:31:27.727 227317 DEBUG oslo_concurrency.lockutils [req-29ee7bc3-afbf-442d-bb4a-5af361918fc6 req-beb7850a-74b1-4fb3-917c-dbed26064cfd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:31:27 np0005596062 nova_compute[227313]: 2026-01-26 18:31:27.727 227317 DEBUG nova.compute.manager [req-29ee7bc3-afbf-442d-bb4a-5af361918fc6 req-beb7850a-74b1-4fb3-917c-dbed26064cfd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] No waiting events found dispatching network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:31:27 np0005596062 nova_compute[227313]: 2026-01-26 18:31:27.728 227317 WARNING nova.compute.manager [req-29ee7bc3-afbf-442d-bb4a-5af361918fc6 req-beb7850a-74b1-4fb3-917c-dbed26064cfd 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received unexpected event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:31:28 np0005596062 nova_compute[227313]: 2026-01-26 18:31:28.442 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:31:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:28.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:31:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:31:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:31:30 np0005596062 NetworkManager[48993]: <info>  [1769452290.0806] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 26 13:31:30 np0005596062 NetworkManager[48993]: <info>  [1769452290.0821] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.080 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.334 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:30 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:30Z|00157|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.360 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.922384) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452290922516, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1476, "num_deletes": 251, "total_data_size": 3292348, "memory_usage": 3339640, "flush_reason": "Manual Compaction"}
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452290934101, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1299951, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39309, "largest_seqno": 40780, "table_properties": {"data_size": 1295147, "index_size": 2136, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13063, "raw_average_key_size": 21, "raw_value_size": 1284514, "raw_average_value_size": 2075, "num_data_blocks": 95, "num_entries": 619, "num_filter_entries": 619, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452165, "oldest_key_time": 1769452165, "file_creation_time": 1769452290, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 11774 microseconds, and 5734 cpu microseconds.
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.934176) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1299951 bytes OK
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.934204) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.936261) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.936276) EVENT_LOG_v1 {"time_micros": 1769452290936271, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.936303) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3285533, prev total WAL file size 3285533, number of live WAL files 2.
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.937386) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323533' seq:72057594037927935, type:22 .. '6D6772737461740031353034' seq:0, type:0; will stop at (end)
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1269KB)], [75(10MB)]
Jan 26 13:31:30 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452290937470, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 12480179, "oldest_snapshot_seqno": -1}
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.943 227317 DEBUG nova.compute.manager [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-changed-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.945 227317 DEBUG nova.compute.manager [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Refreshing instance network info cache due to event network-changed-95a388aa-20fa-4cf0-a52b-c0f58db57705. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.945 227317 DEBUG oslo_concurrency.lockutils [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.945 227317 DEBUG oslo_concurrency.lockutils [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:31:30 np0005596062 nova_compute[227313]: 2026-01-26 18:31:30.946 227317 DEBUG nova.network.neutron [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Refreshing network info cache for port 95a388aa-20fa-4cf0-a52b-c0f58db57705 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:31:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:30.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6256 keys, 9503330 bytes, temperature: kUnknown
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452291009338, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 9503330, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9462714, "index_size": 23866, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 160877, "raw_average_key_size": 25, "raw_value_size": 9351261, "raw_average_value_size": 1494, "num_data_blocks": 958, "num_entries": 6256, "num_filter_entries": 6256, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452290, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.009628) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 9503330 bytes
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.011380) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.5 rd, 132.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 10.7 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(16.9) write-amplify(7.3) OK, records in: 6721, records dropped: 465 output_compression: NoCompression
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.011400) EVENT_LOG_v1 {"time_micros": 1769452291011390, "job": 46, "event": "compaction_finished", "compaction_time_micros": 71945, "compaction_time_cpu_micros": 24797, "output_level": 6, "num_output_files": 1, "total_output_size": 9503330, "num_input_records": 6721, "num_output_records": 6256, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452291011731, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452291013420, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:30.937290) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.013463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.013468) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.013470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.013472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:31:31.013474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:31:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:31.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:31 np0005596062 nova_compute[227313]: 2026-01-26 18:31:31.884 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:31:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:32.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:31:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:33.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:33 np0005596062 nova_compute[227313]: 2026-01-26 18:31:33.446 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:34.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:34 np0005596062 nova_compute[227313]: 2026-01-26 18:31:34.987 227317 DEBUG nova.network.neutron [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updated VIF entry in instance network info cache for port 95a388aa-20fa-4cf0-a52b-c0f58db57705. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:31:34 np0005596062 nova_compute[227313]: 2026-01-26 18:31:34.988 227317 DEBUG nova.network.neutron [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updating instance_info_cache with network_info: [{"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:31:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:35.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:35 np0005596062 nova_compute[227313]: 2026-01-26 18:31:35.439 227317 DEBUG oslo_concurrency.lockutils [req-64a577a2-d681-4322-90a1-70d1c512cd16 req-99b37032-126e-4a0e-8bd4-cb773a41eccf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:31:35 np0005596062 podman[252632]: 2026-01-26 18:31:35.857494027 +0000 UTC m=+0.061489731 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 26 13:31:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:36 np0005596062 nova_compute[227313]: 2026-01-26 18:31:36.920 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:36.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:37.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:37 np0005596062 nova_compute[227313]: 2026-01-26 18:31:37.246 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:31:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:31:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:31:38 np0005596062 nova_compute[227313]: 2026-01-26 18:31:38.450 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:38.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:39.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:40.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:41.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:41 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:ec:da 10.100.0.10
Jan 26 13:31:41 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:41Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:ec:da 10.100.0.10
Jan 26 13:31:41 np0005596062 nova_compute[227313]: 2026-01-26 18:31:41.921 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:42.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:43.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:43 np0005596062 nova_compute[227313]: 2026-01-26 18:31:43.453 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:31:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8112 writes, 40K keys, 8112 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s#012Cumulative WAL: 8111 writes, 8111 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1565 writes, 7902 keys, 1565 commit groups, 1.0 writes per commit group, ingest: 15.74 MB, 0.03 MB/s#012Interval WAL: 1564 writes, 1564 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     70.3      0.72              0.17        23    0.031       0      0       0.0       0.0#012  L6      1/0    9.06 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   3.7     82.5     67.7      2.81              0.60        22    0.128    119K    12K       0.0       0.0#012 Sum      1/0    9.06 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   4.7     65.6     68.2      3.53              0.78        45    0.078    119K    12K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.7     72.9     73.0      0.95              0.22        12    0.080     39K   3565       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0     82.5     67.7      2.81              0.60        22    0.128    119K    12K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     70.5      0.72              0.17        22    0.033       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.050, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.24 GB write, 0.08 MB/s write, 0.23 GB read, 0.08 MB/s read, 3.5 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 26.02 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000174 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1495,25.11 MB,8.26054%) FilterBlock(45,336.48 KB,0.108091%) IndexBlock(45,589.58 KB,0.189395%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 13:31:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:44.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:45.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:31:45 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:31:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:46 np0005596062 nova_compute[227313]: 2026-01-26 18:31:46.924 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:46.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:47.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:47 np0005596062 podman[252889]: 2026-01-26 18:31:47.890551945 +0000 UTC m=+0.100204344 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 13:31:48 np0005596062 nova_compute[227313]: 2026-01-26 18:31:48.499 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:31:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:48.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:31:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:49.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:50.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:31:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:51.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:51 np0005596062 nova_compute[227313]: 2026-01-26 18:31:51.926 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:52.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:53.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:53.264 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:31:53 np0005596062 nova_compute[227313]: 2026-01-26 18:31:53.265 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:53.265 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:31:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:31:53.266 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:31:53 np0005596062 nova_compute[227313]: 2026-01-26 18:31:53.501 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:53Z|00158|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:31:53 np0005596062 nova_compute[227313]: 2026-01-26 18:31:53.898 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:54.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:55.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:55 np0005596062 ovn_controller[133984]: 2026-01-26T18:31:55Z|00159|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:31:55 np0005596062 nova_compute[227313]: 2026-01-26 18:31:55.630 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:31:56 np0005596062 nova_compute[227313]: 2026-01-26 18:31:56.929 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:56.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:57.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:58 np0005596062 nova_compute[227313]: 2026-01-26 18:31:58.505 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:31:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:31:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:31:58.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:31:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:31:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:31:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:31:59.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:00.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:01.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:01 np0005596062 nova_compute[227313]: 2026-01-26 18:32:01.932 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:02.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:03.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:03 np0005596062 nova_compute[227313]: 2026-01-26 18:32:03.508 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:04.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:05.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:06 np0005596062 podman[252975]: 2026-01-26 18:32:06.85588954 +0000 UTC m=+0.062963140 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:32:06 np0005596062 nova_compute[227313]: 2026-01-26 18:32:06.935 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:06.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:07 np0005596062 nova_compute[227313]: 2026-01-26 18:32:07.054 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:07.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:08 np0005596062 nova_compute[227313]: 2026-01-26 18:32:08.510 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:08.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:09 np0005596062 nova_compute[227313]: 2026-01-26 18:32:09.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:09 np0005596062 nova_compute[227313]: 2026-01-26 18:32:09.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:32:09 np0005596062 nova_compute[227313]: 2026-01-26 18:32:09.133 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:32:09 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 26 13:32:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:09.182 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:09.183 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:09.184 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:09.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:10 np0005596062 nova_compute[227313]: 2026-01-26 18:32:10.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:10 np0005596062 nova_compute[227313]: 2026-01-26 18:32:10.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:32:10 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 26 13:32:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:11.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:11 np0005596062 nova_compute[227313]: 2026-01-26 18:32:11.937 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 26 13:32:12 np0005596062 nova_compute[227313]: 2026-01-26 18:32:12.171 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:12 np0005596062 nova_compute[227313]: 2026-01-26 18:32:12.743 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:12 np0005596062 nova_compute[227313]: 2026-01-26 18:32:12.744 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:12 np0005596062 nova_compute[227313]: 2026-01-26 18:32:12.744 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:12 np0005596062 nova_compute[227313]: 2026-01-26 18:32:12.744 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:32:12 np0005596062 nova_compute[227313]: 2026-01-26 18:32:12.744 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:32:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:13.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 26 13:32:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:32:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2779897314' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:32:13 np0005596062 nova_compute[227313]: 2026-01-26 18:32:13.223 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:32:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:13.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:13 np0005596062 nova_compute[227313]: 2026-01-26 18:32:13.513 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:13 np0005596062 nova_compute[227313]: 2026-01-26 18:32:13.973 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:32:13 np0005596062 nova_compute[227313]: 2026-01-26 18:32:13.973 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:32:14 np0005596062 nova_compute[227313]: 2026-01-26 18:32:14.157 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:32:14 np0005596062 nova_compute[227313]: 2026-01-26 18:32:14.158 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4573MB free_disk=20.942455291748047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:32:14 np0005596062 nova_compute[227313]: 2026-01-26 18:32:14.158 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:14 np0005596062 nova_compute[227313]: 2026-01-26 18:32:14.159 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:15.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:15.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.329 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance acd8b26d-b140-49c9-94cc-9d68fd5fa9bd actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.329 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.330 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.458 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.535 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.536 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.557 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.583 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:32:15 np0005596062 nova_compute[227313]: 2026-01-26 18:32:15.628 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:32:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:32:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 15K writes, 56K keys, 15K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 15K writes, 5008 syncs, 3.07 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2939 writes, 9027 keys, 2939 commit groups, 1.0 writes per commit group, ingest: 7.53 MB, 0.01 MB/s#012Interval WAL: 2939 writes, 1197 syncs, 2.46 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:32:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:32:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1647900677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:32:16 np0005596062 nova_compute[227313]: 2026-01-26 18:32:16.102 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:32:16 np0005596062 nova_compute[227313]: 2026-01-26 18:32:16.108 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:32:16 np0005596062 nova_compute[227313]: 2026-01-26 18:32:16.149 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:32:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:16 np0005596062 nova_compute[227313]: 2026-01-26 18:32:16.234 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:32:16 np0005596062 nova_compute[227313]: 2026-01-26 18:32:16.235 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:16 np0005596062 nova_compute[227313]: 2026-01-26 18:32:16.939 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:17.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:17.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:18 np0005596062 nova_compute[227313]: 2026-01-26 18:32:18.521 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:18 np0005596062 podman[253070]: 2026-01-26 18:32:18.706862479 +0000 UTC m=+0.084352101 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 26 13:32:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:19.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:19.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:20 np0005596062 nova_compute[227313]: 2026-01-26 18:32:20.115 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 26 13:32:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:21.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:21.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:21 np0005596062 nova_compute[227313]: 2026-01-26 18:32:21.941 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:23.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:23.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:23 np0005596062 nova_compute[227313]: 2026-01-26 18:32:23.523 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:24 np0005596062 nova_compute[227313]: 2026-01-26 18:32:24.399 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:24 np0005596062 nova_compute[227313]: 2026-01-26 18:32:24.399 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:32:24 np0005596062 nova_compute[227313]: 2026-01-26 18:32:24.399 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:32:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:25.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:25.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:25 np0005596062 nova_compute[227313]: 2026-01-26 18:32:25.753 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:32:25 np0005596062 nova_compute[227313]: 2026-01-26 18:32:25.753 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:32:25 np0005596062 nova_compute[227313]: 2026-01-26 18:32:25.753 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:32:25 np0005596062 nova_compute[227313]: 2026-01-26 18:32:25.754 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid acd8b26d-b140-49c9-94cc-9d68fd5fa9bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:32:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:27 np0005596062 nova_compute[227313]: 2026-01-26 18:32:27.002 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:27.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:27.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:28 np0005596062 nova_compute[227313]: 2026-01-26 18:32:28.219 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:28 np0005596062 nova_compute[227313]: 2026-01-26 18:32:28.525 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:29.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:29.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:30 np0005596062 nova_compute[227313]: 2026-01-26 18:32:30.941 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updating instance_info_cache with network_info: [{"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:32:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:31.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.082 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.083 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.083 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.083 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.083 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.084 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.084 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.084 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.084 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.084 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.254 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:31.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:31 np0005596062 nova_compute[227313]: 2026-01-26 18:32:31.966 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:31.966 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:32:31 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:31.967 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:32:32 np0005596062 nova_compute[227313]: 2026-01-26 18:32:32.004 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:32 np0005596062 nova_compute[227313]: 2026-01-26 18:32:32.686 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:33.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:33.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:33 np0005596062 nova_compute[227313]: 2026-01-26 18:32:33.527 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:33 np0005596062 nova_compute[227313]: 2026-01-26 18:32:33.704 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:35.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:35.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:36 np0005596062 nova_compute[227313]: 2026-01-26 18:32:36.589 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:32:36 np0005596062 nova_compute[227313]: 2026-01-26 18:32:36.616 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Triggering sync for uuid acd8b26d-b140-49c9-94cc-9d68fd5fa9bd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 26 13:32:36 np0005596062 nova_compute[227313]: 2026-01-26 18:32:36.618 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:36 np0005596062 nova_compute[227313]: 2026-01-26 18:32:36.618 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:36 np0005596062 nova_compute[227313]: 2026-01-26 18:32:36.663 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:37 np0005596062 nova_compute[227313]: 2026-01-26 18:32:37.006 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:37.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:37.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:37 np0005596062 podman[253132]: 2026-01-26 18:32:37.857963524 +0000 UTC m=+0.066497085 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 26 13:32:37 np0005596062 nova_compute[227313]: 2026-01-26 18:32:37.907 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:38 np0005596062 nova_compute[227313]: 2026-01-26 18:32:38.529 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:38 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:38.969 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:32:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:39.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:39.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:41.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:41.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.712845) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452361712884, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 976, "num_deletes": 252, "total_data_size": 1991010, "memory_usage": 2017192, "flush_reason": "Manual Compaction"}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452361776123, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1313112, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40785, "largest_seqno": 41756, "table_properties": {"data_size": 1308585, "index_size": 2179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10101, "raw_average_key_size": 19, "raw_value_size": 1299433, "raw_average_value_size": 2568, "num_data_blocks": 96, "num_entries": 506, "num_filter_entries": 506, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452291, "oldest_key_time": 1769452291, "file_creation_time": 1769452361, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 63351 microseconds, and 4392 cpu microseconds.
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.776190) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1313112 bytes OK
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.776215) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.778846) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.778866) EVENT_LOG_v1 {"time_micros": 1769452361778859, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.778883) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1986146, prev total WAL file size 1986146, number of live WAL files 2.
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.779554) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1282KB)], [78(9280KB)]
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452361779622, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 10816442, "oldest_snapshot_seqno": -1}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6241 keys, 8872390 bytes, temperature: kUnknown
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452361831611, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 8872390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8832363, "index_size": 23341, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 161314, "raw_average_key_size": 25, "raw_value_size": 8721664, "raw_average_value_size": 1397, "num_data_blocks": 930, "num_entries": 6241, "num_filter_entries": 6241, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452361, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.831958) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 8872390 bytes
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.833625) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 207.4 rd, 170.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 9.1 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(15.0) write-amplify(6.8) OK, records in: 6762, records dropped: 521 output_compression: NoCompression
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.833646) EVENT_LOG_v1 {"time_micros": 1769452361833637, "job": 48, "event": "compaction_finished", "compaction_time_micros": 52157, "compaction_time_cpu_micros": 21043, "output_level": 6, "num_output_files": 1, "total_output_size": 8872390, "num_input_records": 6762, "num_output_records": 6241, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452361834105, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452361835673, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.779510) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.835762) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.835767) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.835769) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.835771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:32:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:32:41.835772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:32:42 np0005596062 nova_compute[227313]: 2026-01-26 18:32:42.008 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:43.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:43.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:43 np0005596062 nova_compute[227313]: 2026-01-26 18:32:43.531 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:45.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:45.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:45 np0005596062 podman[253378]: 2026-01-26 18:32:45.689248313 +0000 UTC m=+0.166089671 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 26 13:32:45 np0005596062 podman[253378]: 2026-01-26 18:32:45.809384008 +0000 UTC m=+0.286225346 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 13:32:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:46 np0005596062 podman[253537]: 2026-01-26 18:32:46.567773277 +0000 UTC m=+0.144791334 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:32:46 np0005596062 podman[253537]: 2026-01-26 18:32:46.60315188 +0000 UTC m=+0.180169927 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:32:47 np0005596062 nova_compute[227313]: 2026-01-26 18:32:47.010 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:47.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:47.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:48 np0005596062 nova_compute[227313]: 2026-01-26 18:32:48.573 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:48 np0005596062 podman[253601]: 2026-01-26 18:32:48.690865828 +0000 UTC m=+1.633341039 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, description=keepalived for Ceph, release=1793, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, version=2.2.4, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2)
Jan 26 13:32:48 np0005596062 podman[253624]: 2026-01-26 18:32:48.793012283 +0000 UTC m=+0.078123575 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, architecture=x86_64, version=2.2.4, distribution-scope=public, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Jan 26 13:32:48 np0005596062 podman[253601]: 2026-01-26 18:32:48.800202475 +0000 UTC m=+1.742677686 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, architecture=x86_64, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, distribution-scope=public)
Jan 26 13:32:48 np0005596062 podman[253637]: 2026-01-26 18:32:48.885300245 +0000 UTC m=+0.092769086 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 13:32:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:49.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:49.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:51.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:51.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 26 13:32:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:32:51 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:32:52 np0005596062 nova_compute[227313]: 2026-01-26 18:32:52.062 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:52 np0005596062 ovn_controller[133984]: 2026-01-26T18:32:52Z|00160|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:32:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:32:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:32:52 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:32:52 np0005596062 nova_compute[227313]: 2026-01-26 18:32:52.659 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:52 np0005596062 ovn_controller[133984]: 2026-01-26T18:32:52Z|00161|binding|INFO|Releasing lport 57b3bf6f-2b11-4a16-b6fc-c8194ada158a from this chassis (sb_readonly=0)
Jan 26 13:32:52 np0005596062 nova_compute[227313]: 2026-01-26 18:32:52.921 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:53.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:32:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:53.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:32:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 26 13:32:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 13:32:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 26 13:32:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 26 13:32:53 np0005596062 radosgw[83289]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 26 13:32:53 np0005596062 nova_compute[227313]: 2026-01-26 18:32:53.575 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.031 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.032 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.032 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.032 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.032 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.033 227317 INFO nova.compute.manager [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Terminating instance#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.034 227317 DEBUG nova.compute.manager [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:32:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:55.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:55 np0005596062 kernel: tap95a388aa-20 (unregistering): left promiscuous mode
Jan 26 13:32:55 np0005596062 NetworkManager[48993]: <info>  [1769452375.1057] device (tap95a388aa-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.114 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 ovn_controller[133984]: 2026-01-26T18:32:55Z|00162|binding|INFO|Releasing lport 95a388aa-20fa-4cf0-a52b-c0f58db57705 from this chassis (sb_readonly=0)
Jan 26 13:32:55 np0005596062 ovn_controller[133984]: 2026-01-26T18:32:55Z|00163|binding|INFO|Setting lport 95a388aa-20fa-4cf0-a52b-c0f58db57705 down in Southbound
Jan 26 13:32:55 np0005596062 ovn_controller[133984]: 2026-01-26T18:32:55Z|00164|binding|INFO|Removing iface tap95a388aa-20 ovn-installed in OVS
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.122 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.138 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.157 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:ec:da 10.100.0.10'], port_security=['fa:16:3e:30:ec:da 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'acd8b26d-b140-49c9-94cc-9d68fd5fa9bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6ffe169-5606-433e-936f-c0a2554b460d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b158396183b64160b56d3c4df4ae6550', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0f7a06fa-e94c-4ca3-aef6-af3df08dfe1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd476994-76c7-4ad1-88ba-247776af23a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=95a388aa-20fa-4cf0-a52b-c0f58db57705) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.159 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 95a388aa-20fa-4cf0-a52b-c0f58db57705 in datapath a6ffe169-5606-433e-936f-c0a2554b460d unbound from our chassis#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.160 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6ffe169-5606-433e-936f-c0a2554b460d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.161 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[28e77473-d559-4720-a31a-9a14f8925cb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.162 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d namespace which is not needed anymore#033[00m
Jan 26 13:32:55 np0005596062 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 26 13:32:55 np0005596062 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 18.516s CPU time.
Jan 26 13:32:55 np0005596062 systemd-machined[195380]: Machine qemu-16-instance-00000015 terminated.
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.269 227317 INFO nova.virt.libvirt.driver [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Instance destroyed successfully.#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.270 227317 DEBUG nova.objects.instance [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lazy-loading 'resources' on Instance uuid acd8b26d-b140-49c9-94cc-9d68fd5fa9bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:32:55 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [NOTICE]   (252598) : haproxy version is 2.8.14-c23fe91
Jan 26 13:32:55 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [NOTICE]   (252598) : path to executable is /usr/sbin/haproxy
Jan 26 13:32:55 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [WARNING]  (252598) : Exiting Master process...
Jan 26 13:32:55 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [ALERT]    (252598) : Current worker (252609) exited with code 143 (Terminated)
Jan 26 13:32:55 np0005596062 neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d[252570]: [WARNING]  (252598) : All workers exited. Exiting... (0)
Jan 26 13:32:55 np0005596062 systemd[1]: libpod-47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce.scope: Deactivated successfully.
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.298 227317 DEBUG nova.virt.libvirt.vif [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:31:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1269771433',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1269771433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1269771433',id=21,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInDz+GLYJtk95PlzrHqTNQERTx1bynGIuwweyi5YHrc/aXQ2pURgEiq/Gs5/yI9jMLkStr288XvF2jTexSQOlBlxgRG2TOYDl2OkNKCPigp/5UAqde30Xz/zQyXB4jPlQ==',key_name='tempest-keypair-809081161',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:31:25Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b158396183b64160b56d3c4df4ae6550',ramdisk_id='',reservation_id='r-fta0379e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-1200254346',owner_user_name='tempest-TestVolumeBootPattern-1200254346-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:31:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1859ed83e26a48fdadcb5b9899dae46e',uuid=acd8b26d-b140-49c9-94cc-9d68fd5fa9bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:32:55 np0005596062 podman[253822]: 2026-01-26 18:32:55.299674851 +0000 UTC m=+0.042925186 container died 47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.299 227317 DEBUG nova.network.os_vif_util [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converting VIF {"id": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "address": "fa:16:3e:30:ec:da", "network": {"id": "a6ffe169-5606-433e-936f-c0a2554b460d", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-318704212-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b158396183b64160b56d3c4df4ae6550", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95a388aa-20", "ovs_interfaceid": "95a388aa-20fa-4cf0-a52b-c0f58db57705", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.301 227317 DEBUG nova.network.os_vif_util [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.302 227317 DEBUG os_vif [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.303 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.303 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95a388aa-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.305 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.306 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.309 227317 INFO os_vif [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:ec:da,bridge_name='br-int',has_traffic_filtering=True,id=95a388aa-20fa-4cf0-a52b-c0f58db57705,network=Network(a6ffe169-5606-433e-936f-c0a2554b460d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95a388aa-20')#033[00m
Jan 26 13:32:55 np0005596062 systemd[1]: var-lib-containers-storage-overlay-0a3f2a5997f46f2d014a1054fe1bd7972795996fb83493bb1dbc80a05dead548-merged.mount: Deactivated successfully.
Jan 26 13:32:55 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce-userdata-shm.mount: Deactivated successfully.
Jan 26 13:32:55 np0005596062 podman[253822]: 2026-01-26 18:32:55.345030501 +0000 UTC m=+0.088280836 container cleanup 47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:32:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:55.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:55 np0005596062 systemd[1]: libpod-conmon-47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce.scope: Deactivated successfully.
Jan 26 13:32:55 np0005596062 podman[253879]: 2026-01-26 18:32:55.410898638 +0000 UTC m=+0.045021632 container remove 47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.416 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[10dedd1a-692d-44c5-8873-43d2debe6894]: (4, ('Mon Jan 26 06:32:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d (47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce)\n47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce\nMon Jan 26 06:32:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d (47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce)\n47ce5129d7826b181d862ea883be534ee020d1f87738e1d7d59efd713309ecce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.417 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9d021bec-4b54-49d2-8f36-c792ec46fec8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.419 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6ffe169-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.465 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 kernel: tapa6ffe169-50: left promiscuous mode
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.480 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.483 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7707eb26-8861-4ab2-b19a-727a9ba5ee8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.495 227317 INFO nova.virt.libvirt.driver [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Deleting instance files /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_del#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.496 227317 INFO nova.virt.libvirt.driver [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Deletion of /var/lib/nova/instances/acd8b26d-b140-49c9-94cc-9d68fd5fa9bd_del complete#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.501 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a72ce4b7-9490-4a17-8d80-3313f0652643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.503 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4bf0d6-8271-4664-9e48-49b7bc0e4ed7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.518 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[06be85db-1886-48af-94b1-5d662e1fa3de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604467, 'reachable_time': 23569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253895, 'error': None, 'target': 'ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.521 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6ffe169-5606-433e-936f-c0a2554b460d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:32:55 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:32:55.521 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[198405c7-254a-487f-a506-d76c5b7181ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:32:55 np0005596062 systemd[1]: run-netns-ovnmeta\x2da6ffe169\x2d5606\x2d433e\x2d936f\x2dc0a2554b460d.mount: Deactivated successfully.
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.628 227317 INFO nova.compute.manager [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.629 227317 DEBUG oslo.service.loopingcall [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.629 227317 DEBUG nova.compute.manager [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:32:55 np0005596062 nova_compute[227313]: 2026-01-26 18:32:55.629 227317 DEBUG nova.network.neutron [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:32:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:32:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 26 13:32:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:57.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:57 np0005596062 nova_compute[227313]: 2026-01-26 18:32:57.064 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:32:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:57.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:57 np0005596062 nova_compute[227313]: 2026-01-26 18:32:57.470 227317 DEBUG nova.network.neutron [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:32:57 np0005596062 nova_compute[227313]: 2026-01-26 18:32:57.497 227317 INFO nova.compute.manager [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Took 1.87 seconds to deallocate network for instance.#033[00m
Jan 26 13:32:57 np0005596062 nova_compute[227313]: 2026-01-26 18:32:57.653 227317 DEBUG nova.compute.manager [req-e236d3b6-602f-4fc0-b0a3-8e4aa24dcf38 req-d2a96d86-24dd-4959-a1d4-5b278ef0f3a1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-vif-deleted-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.001 227317 INFO nova.compute.manager [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Took 0.50 seconds to detach 1 volumes for instance.#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.002 227317 DEBUG nova.compute.manager [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Deleting volume: f8a2a75e-f2e1-4a92-96d8-39cd07be8f2c _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.211 227317 DEBUG nova.compute.manager [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-vif-unplugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.212 227317 DEBUG oslo_concurrency.lockutils [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.212 227317 DEBUG oslo_concurrency.lockutils [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.212 227317 DEBUG oslo_concurrency.lockutils [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.213 227317 DEBUG nova.compute.manager [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] No waiting events found dispatching network-vif-unplugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.213 227317 DEBUG nova.compute.manager [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-vif-unplugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.213 227317 DEBUG nova.compute.manager [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.213 227317 DEBUG oslo_concurrency.lockutils [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.213 227317 DEBUG oslo_concurrency.lockutils [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.214 227317 DEBUG oslo_concurrency.lockutils [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.214 227317 DEBUG nova.compute.manager [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] No waiting events found dispatching network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.214 227317 WARNING nova.compute.manager [req-2ec754d7-e9a9-4b28-8290-5cf709238fd7 req-500fa2db-617a-4608-b012-5fabf29277a9 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Received unexpected event network-vif-plugged-95a388aa-20fa-4cf0-a52b-c0f58db57705 for instance with vm_state active and task_state deleting.#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.316 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.316 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.394 227317 DEBUG oslo_concurrency.processutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:32:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:32:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1699268378' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.888 227317 DEBUG oslo_concurrency.processutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:32:58 np0005596062 nova_compute[227313]: 2026-01-26 18:32:58.895 227317 DEBUG nova.compute.provider_tree [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:32:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:32:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:32:59.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:32:59 np0005596062 nova_compute[227313]: 2026-01-26 18:32:59.146 227317 DEBUG nova.scheduler.client.report [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:32:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:32:59 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:32:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:32:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:32:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:32:59.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:32:59 np0005596062 nova_compute[227313]: 2026-01-26 18:32:59.355 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:32:59 np0005596062 nova_compute[227313]: 2026-01-26 18:32:59.569 227317 INFO nova.scheduler.client.report [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Deleted allocations for instance acd8b26d-b140-49c9-94cc-9d68fd5fa9bd#033[00m
Jan 26 13:32:59 np0005596062 nova_compute[227313]: 2026-01-26 18:32:59.745 227317 DEBUG oslo_concurrency.lockutils [None req-6cdd679c-e305-4cd7-9709-0deb035d50be 1859ed83e26a48fdadcb5b9899dae46e b158396183b64160b56d3c4df4ae6550 - - default default] Lock "acd8b26d-b140-49c9-94cc-9d68fd5fa9bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:00 np0005596062 nova_compute[227313]: 2026-01-26 18:33:00.306 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 26 13:33:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:01.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:01.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:02 np0005596062 nova_compute[227313]: 2026-01-26 18:33:02.066 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:03.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:03.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:05.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:05 np0005596062 nova_compute[227313]: 2026-01-26 18:33:05.308 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:05.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:05 np0005596062 nova_compute[227313]: 2026-01-26 18:33:05.649 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 26 13:33:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:07.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:07 np0005596062 nova_compute[227313]: 2026-01-26 18:33:07.068 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:33:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:07.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:33:08 np0005596062 podman[254025]: 2026-01-26 18:33:08.857723565 +0000 UTC m=+0.060998618 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 26 13:33:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:09.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:09.183 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:09.183 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:09.183 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:09.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:10 np0005596062 nova_compute[227313]: 2026-01-26 18:33:10.269 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452375.267365, acd8b26d-b140-49c9-94cc-9d68fd5fa9bd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:33:10 np0005596062 nova_compute[227313]: 2026-01-26 18:33:10.269 227317 INFO nova.compute.manager [-] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:33:10 np0005596062 nova_compute[227313]: 2026-01-26 18:33:10.293 227317 DEBUG nova.compute.manager [None req-287565d6-82a0-49a4-8d7a-f3f74fb955ae - - - - - -] [instance: acd8b26d-b140-49c9-94cc-9d68fd5fa9bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:33:10 np0005596062 nova_compute[227313]: 2026-01-26 18:33:10.309 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:10 np0005596062 nova_compute[227313]: 2026-01-26 18:33:10.337 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:11.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:11.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.070 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.085 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.086 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.086 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.086 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.087 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:33:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/633723403' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.517 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.679 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.680 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4716MB free_disk=20.942623138427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.680 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.680 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.857 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.858 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:33:12 np0005596062 nova_compute[227313]: 2026-01-26 18:33:12.881 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:13.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:33:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3388797487' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:33:13 np0005596062 nova_compute[227313]: 2026-01-26 18:33:13.344 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:13 np0005596062 nova_compute[227313]: 2026-01-26 18:33:13.351 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:33:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:13.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:13 np0005596062 nova_compute[227313]: 2026-01-26 18:33:13.383 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:33:13 np0005596062 nova_compute[227313]: 2026-01-26 18:33:13.418 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:33:13 np0005596062 nova_compute[227313]: 2026-01-26 18:33:13.419 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:13 np0005596062 nova_compute[227313]: 2026-01-26 18:33:13.488 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:15.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:15 np0005596062 nova_compute[227313]: 2026-01-26 18:33:15.339 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:15.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:16 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 13:33:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:17 np0005596062 nova_compute[227313]: 2026-01-26 18:33:17.072 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:17.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:17.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:19.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:19 np0005596062 podman[254122]: 2026-01-26 18:33:19.232366789 +0000 UTC m=+0.105418013 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.345 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:19.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.419 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.420 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.420 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.514 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.514 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.515 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.515 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.515 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.860 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.860 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:19 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.907 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:19.999 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.000 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.005 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.006 227317 INFO nova.compute.claims [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.282 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.342 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:33:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3487180798' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.714 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.719 227317 DEBUG nova.compute.provider_tree [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.781 227317 DEBUG nova.scheduler.client.report [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.958 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:20 np0005596062 nova_compute[227313]: 2026-01-26 18:33:20.960 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.027 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.028 227317 DEBUG nova.network.neutron [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.055 227317 INFO nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:33:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:21.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.093 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.300 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.301 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.302 227317 INFO nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Creating image(s)#033[00m
Jan 26 13:33:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:21.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.425 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.454 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.480 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.483 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.507 227317 DEBUG nova.policy [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffa1cd7ba9e543f78f2ef48c2a7a67a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '301bad5c2066428fa7f214024672bf92', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.543 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.544 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.544 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.545 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.570 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:21 np0005596062 nova_compute[227313]: 2026-01-26 18:33:21.573 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 b43c1568-f367-4a8e-beda-27b963ce3769_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.003 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 b43c1568-f367-4a8e-beda-27b963ce3769_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.074 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.079 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] resizing rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.201 227317 DEBUG nova.objects.instance [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'migration_context' on Instance uuid b43c1568-f367-4a8e-beda-27b963ce3769 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.226 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.226 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Ensure instance console log exists: /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.227 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.227 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.228 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:22.945 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:33:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:22.946 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:33:22 np0005596062 nova_compute[227313]: 2026-01-26 18:33:22.946 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:23.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:23 np0005596062 nova_compute[227313]: 2026-01-26 18:33:23.189 227317 DEBUG nova.network.neutron [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Successfully created port: 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:33:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:23.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:23 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:23.947 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:24 np0005596062 nova_compute[227313]: 2026-01-26 18:33:24.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:24 np0005596062 nova_compute[227313]: 2026-01-26 18:33:24.623 227317 DEBUG nova.network.neutron [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Successfully updated port: 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:33:24 np0005596062 nova_compute[227313]: 2026-01-26 18:33:24.643 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:33:24 np0005596062 nova_compute[227313]: 2026-01-26 18:33:24.643 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquired lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:33:24 np0005596062 nova_compute[227313]: 2026-01-26 18:33:24.644 227317 DEBUG nova.network.neutron [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:33:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:25.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:25 np0005596062 nova_compute[227313]: 2026-01-26 18:33:25.121 227317 DEBUG nova.compute.manager [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-changed-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:33:25 np0005596062 nova_compute[227313]: 2026-01-26 18:33:25.121 227317 DEBUG nova.compute.manager [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Refreshing instance network info cache due to event network-changed-3cc10671-fb0e-4d3d-9b4e-93636e6c4238. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:33:25 np0005596062 nova_compute[227313]: 2026-01-26 18:33:25.121 227317 DEBUG oslo_concurrency.lockutils [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:33:25 np0005596062 nova_compute[227313]: 2026-01-26 18:33:25.206 227317 DEBUG nova.network.neutron [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:33:25 np0005596062 nova_compute[227313]: 2026-01-26 18:33:25.343 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:25.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:33:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.797 227317 DEBUG nova.network.neutron [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updating instance_info_cache with network_info: [{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.861 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Releasing lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.861 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Instance network_info: |[{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.862 227317 DEBUG oslo_concurrency.lockutils [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.862 227317 DEBUG nova.network.neutron [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Refreshing network info cache for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.865 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Start _get_guest_xml network_info=[{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.869 227317 WARNING nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.882 227317 DEBUG nova.virt.libvirt.host [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.883 227317 DEBUG nova.virt.libvirt.host [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.887 227317 DEBUG nova.virt.libvirt.host [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.887 227317 DEBUG nova.virt.libvirt.host [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.889 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.889 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.890 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.890 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.890 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.891 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.891 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.891 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.891 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.892 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.892 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.892 227317 DEBUG nova.virt.hardware [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:33:26 np0005596062 nova_compute[227313]: 2026-01-26 18:33:26.894 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:27 np0005596062 nova_compute[227313]: 2026-01-26 18:33:27.076 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:27.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:27.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:33:27 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/966119763' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:33:27 np0005596062 nova_compute[227313]: 2026-01-26 18:33:27.712 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.817s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:27 np0005596062 nova_compute[227313]: 2026-01-26 18:33:27.736 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:27 np0005596062 nova_compute[227313]: 2026-01-26 18:33:27.740 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:33:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4066242479' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.168 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.170 227317 DEBUG nova.virt.libvirt.vif [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1719150974',display_name='tempest-TestNetworkAdvancedServerOps-server-1719150974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1719150974',id=23,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDEOHm6WKCKYdZcYs7kRuLusjif5ojXniiJMhrJrHg7YfxMMa9vKhpHePgKZBFWmUylH+aD0GyBeL4fpDu6rEGxI0F93dA9XsN5JYScbkt/Ge6Oa00kHQ2bRd1K4UHVxMw==',key_name='tempest-TestNetworkAdvancedServerOps-1342297099',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-51db0pi2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:33:21Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b43c1568-f367-4a8e-beda-27b963ce3769,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.171 227317 DEBUG nova.network.os_vif_util [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.172 227317 DEBUG nova.network.os_vif_util [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.174 227317 DEBUG nova.objects.instance [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'pci_devices' on Instance uuid b43c1568-f367-4a8e-beda-27b963ce3769 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.204 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <uuid>b43c1568-f367-4a8e-beda-27b963ce3769</uuid>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <name>instance-00000017</name>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1719150974</nova:name>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:33:26</nova:creationTime>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:user uuid="ffa1cd7ba9e543f78f2ef48c2a7a67a2">tempest-TestNetworkAdvancedServerOps-1357272614-project-member</nova:user>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:project uuid="301bad5c2066428fa7f214024672bf92">tempest-TestNetworkAdvancedServerOps-1357272614</nova:project>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <nova:port uuid="3cc10671-fb0e-4d3d-9b4e-93636e6c4238">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <entry name="serial">b43c1568-f367-4a8e-beda-27b963ce3769</entry>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <entry name="uuid">b43c1568-f367-4a8e-beda-27b963ce3769</entry>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/b43c1568-f367-4a8e-beda-27b963ce3769_disk">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/b43c1568-f367-4a8e-beda-27b963ce3769_disk.config">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:78:cc:99"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <target dev="tap3cc10671-fb"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/console.log" append="off"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:33:28 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:33:28 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:33:28 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:33:28 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.205 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Preparing to wait for external event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.205 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.206 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.206 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.206 227317 DEBUG nova.virt.libvirt.vif [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1719150974',display_name='tempest-TestNetworkAdvancedServerOps-server-1719150974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1719150974',id=23,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDEOHm6WKCKYdZcYs7kRuLusjif5ojXniiJMhrJrHg7YfxMMa9vKhpHePgKZBFWmUylH+aD0GyBeL4fpDu6rEGxI0F93dA9XsN5JYScbkt/Ge6Oa00kHQ2bRd1K4UHVxMw==',key_name='tempest-TestNetworkAdvancedServerOps-1342297099',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-51db0pi2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:33:21Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b43c1568-f367-4a8e-beda-27b963ce3769,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.207 227317 DEBUG nova.network.os_vif_util [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.207 227317 DEBUG nova.network.os_vif_util [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.207 227317 DEBUG os_vif [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.208 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.208 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.209 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.212 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.212 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cc10671-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.213 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cc10671-fb, col_values=(('external_ids', {'iface-id': '3cc10671-fb0e-4d3d-9b4e-93636e6c4238', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:cc:99', 'vm-uuid': 'b43c1568-f367-4a8e-beda-27b963ce3769'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.215 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:28 np0005596062 NetworkManager[48993]: <info>  [1769452408.2163] manager: (tap3cc10671-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.218 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.225 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.226 227317 INFO os_vif [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb')#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.285 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.285 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.285 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No VIF found with MAC fa:16:3e:78:cc:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.285 227317 INFO nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Using config drive#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.308 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.458 227317 DEBUG nova.network.neutron [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updated VIF entry in instance network info cache for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.458 227317 DEBUG nova.network.neutron [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updating instance_info_cache with network_info: [{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.504 227317 DEBUG oslo_concurrency.lockutils [req-2c83e6d4-017e-45ea-8c08-0f621f48ec2c req-e3b1ae2a-7ab8-49d6-8150-5f9c7d349446 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.929 227317 INFO nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Creating config drive at /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/disk.config#033[00m
Jan 26 13:33:28 np0005596062 nova_compute[227313]: 2026-01-26 18:33:28.933 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt2_43mph execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.063 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt2_43mph" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:29.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.092 227317 DEBUG nova.storage.rbd_utils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b43c1568-f367-4a8e-beda-27b963ce3769_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.097 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/disk.config b43c1568-f367-4a8e-beda-27b963ce3769_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:33:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:29.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.415 227317 DEBUG oslo_concurrency.processutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/disk.config b43c1568-f367-4a8e-beda-27b963ce3769_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.417 227317 INFO nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Deleting local config drive /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769/disk.config because it was imported into RBD.#033[00m
Jan 26 13:33:29 np0005596062 kernel: tap3cc10671-fb: entered promiscuous mode
Jan 26 13:33:29 np0005596062 NetworkManager[48993]: <info>  [1769452409.4757] manager: (tap3cc10671-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Jan 26 13:33:29 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:29Z|00165|binding|INFO|Claiming lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for this chassis.
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.475 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:29 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:29Z|00166|binding|INFO|3cc10671-fb0e-4d3d-9b4e-93636e6c4238: Claiming fa:16:3e:78:cc:99 10.100.0.9
Jan 26 13:33:29 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:29Z|00167|binding|INFO|Setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 ovn-installed in OVS
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.490 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.493 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:29 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:29Z|00168|binding|INFO|Setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 up in Southbound
Jan 26 13:33:29 np0005596062 systemd-udevd[254500]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:33:29 np0005596062 systemd-machined[195380]: New machine qemu-17-instance-00000017.
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.504 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:cc:99 10.100.0.9'], port_security=['fa:16:3e:78:cc:99 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b43c1568-f367-4a8e-beda-27b963ce3769', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15f2d772-da47-4c77-8357-41c40294bae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '2', 'neutron:security_group_ids': '483e03f0-e8ac-4a81-9798-744393eb00da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bbb333e-4739-4b8f-a299-8418443b07cd, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3cc10671-fb0e-4d3d-9b4e-93636e6c4238) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.505 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 in datapath 15f2d772-da47-4c77-8357-41c40294bae5 bound to our chassis#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.507 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 15f2d772-da47-4c77-8357-41c40294bae5#033[00m
Jan 26 13:33:29 np0005596062 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.518 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[acb6c653-4aa8-407c-a02f-bbb3d2ee8973]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.519 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap15f2d772-d1 in ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.521 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap15f2d772-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.521 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5a92eeeb-528c-4bb0-9a09-adfbc973c6ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 NetworkManager[48993]: <info>  [1769452409.5231] device (tap3cc10671-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:33:29 np0005596062 NetworkManager[48993]: <info>  [1769452409.5237] device (tap3cc10671-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.522 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0e035f8a-8737-4fc8-8a02-7990cf0402cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.535 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[60cecce1-da00-4c2a-bd9c-baeb77680ec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.550 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[41b639e4-51b5-4371-a230-6756f275a954]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.581 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[97488ca9-1332-46ca-ab56-24b32035005f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.588 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[87929277-547d-4363-9eb3-e3cf4ae616c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 NetworkManager[48993]: <info>  [1769452409.5912] manager: (tap15f2d772-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Jan 26 13:33:29 np0005596062 systemd-udevd[254503]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.620 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[dd93327f-7960-4c25-822b-5994c37dcf06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.624 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5a3c64-6f8d-4fca-9919-5cbb5a89af5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 NetworkManager[48993]: <info>  [1769452409.6425] device (tap15f2d772-d0): carrier: link connected
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.646 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[9dead6a4-7518-4031-bb10-ed6483a36b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.661 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ea56cbbd-beec-4923-b731-ceacfd0eb0ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15f2d772-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:f4:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616969, 'reachable_time': 17623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254535, 'error': None, 'target': 'ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.677 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6d93f2c2-9006-481a-b673-6703ab23f642]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:f4dd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616969, 'tstamp': 616969}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254536, 'error': None, 'target': 'ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.694 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[68fb85fc-0d83-472a-b962-cab8b7c996a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15f2d772-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:f4:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616969, 'reachable_time': 17623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254537, 'error': None, 'target': 'ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.724 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5a8646bf-91c2-4b70-a0a3-6360cc7c086c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.790 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1f7ed5-5608-4de8-afee-4c0660aa40dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.792 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15f2d772-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.792 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.792 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15f2d772-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.794 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:29 np0005596062 NetworkManager[48993]: <info>  [1769452409.7949] manager: (tap15f2d772-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 26 13:33:29 np0005596062 kernel: tap15f2d772-d0: entered promiscuous mode
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.797 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap15f2d772-d0, col_values=(('external_ids', {'iface-id': '8e79441f-a5e8-497c-8f8b-e77378840eed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.798 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:29 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:29Z|00169|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.811 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.812 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/15f2d772-da47-4c77-8357-41c40294bae5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/15f2d772-da47-4c77-8357-41c40294bae5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.813 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[356cea2f-6031-4ffd-b70b-e0fbe7107e7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.814 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-15f2d772-da47-4c77-8357-41c40294bae5
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/15f2d772-da47-4c77-8357-41c40294bae5.pid.haproxy
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 15f2d772-da47-4c77-8357-41c40294bae5
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:33:29 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:33:29.815 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5', 'env', 'PROCESS_TAG=haproxy-15f2d772-da47-4c77-8357-41c40294bae5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/15f2d772-da47-4c77-8357-41c40294bae5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.988 227317 DEBUG nova.compute.manager [req-25f6dab8-25a9-4bdb-ae5c-2f15d4adfa1e req-452601f8-2f04-42ae-8d2b-d0eb8b139442 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.989 227317 DEBUG oslo_concurrency.lockutils [req-25f6dab8-25a9-4bdb-ae5c-2f15d4adfa1e req-452601f8-2f04-42ae-8d2b-d0eb8b139442 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.989 227317 DEBUG oslo_concurrency.lockutils [req-25f6dab8-25a9-4bdb-ae5c-2f15d4adfa1e req-452601f8-2f04-42ae-8d2b-d0eb8b139442 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.990 227317 DEBUG oslo_concurrency.lockutils [req-25f6dab8-25a9-4bdb-ae5c-2f15d4adfa1e req-452601f8-2f04-42ae-8d2b-d0eb8b139442 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:29 np0005596062 nova_compute[227313]: 2026-01-26 18:33:29.990 227317 DEBUG nova.compute.manager [req-25f6dab8-25a9-4bdb-ae5c-2f15d4adfa1e req-452601f8-2f04-42ae-8d2b-d0eb8b139442 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Processing event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:33:30 np0005596062 podman[254594]: 2026-01-26 18:33:30.179301714 +0000 UTC m=+0.061011189 container create 3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 26 13:33:30 np0005596062 systemd[1]: Started libpod-conmon-3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1.scope.
Jan 26 13:33:30 np0005596062 podman[254594]: 2026-01-26 18:33:30.149353325 +0000 UTC m=+0.031062900 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.247 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452410.2466998, b43c1568-f367-4a8e-beda-27b963ce3769 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.247 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] VM Started (Lifecycle Event)#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.250 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.253 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.258 227317 INFO nova.virt.libvirt.driver [-] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Instance spawned successfully.#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.259 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:33:30 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:33:30 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4bd6e8a9065bd971a4aa0640968a1ff109a5815285517f8775b948a121827a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:33:30 np0005596062 podman[254594]: 2026-01-26 18:33:30.292240966 +0000 UTC m=+0.173950461 container init 3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:33:30 np0005596062 podman[254594]: 2026-01-26 18:33:30.298844262 +0000 UTC m=+0.180553737 container start 3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.307 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.308 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.309 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.310 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.310 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.311 227317 DEBUG nova.virt.libvirt.driver [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.317 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:33:30 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [NOTICE]   (254630) : New worker (254632) forked
Jan 26 13:33:30 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [NOTICE]   (254630) : Loading success.
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.322 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.378 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.379 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452410.2468712, b43c1568-f367-4a8e-beda-27b963ce3769 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.379 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.413 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.417 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452410.2526698, b43c1568-f367-4a8e-beda-27b963ce3769 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.417 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.421 227317 INFO nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Took 9.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.421 227317 DEBUG nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.448 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.451 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.499 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.511 227317 INFO nova.compute.manager [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Took 10.55 seconds to build instance.#033[00m
Jan 26 13:33:30 np0005596062 nova_compute[227313]: 2026-01-26 18:33:30.526 227317 DEBUG oslo_concurrency.lockutils [None req-dfb386b6-55a5-4eb6-b537-1c445e4988bc ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:31.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:31.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.079 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.191 227317 DEBUG nova.compute.manager [req-f7af7536-39b9-4c54-bb92-8add4ca13398 req-9c804be1-4693-47db-a820-009ea5cacd00 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.191 227317 DEBUG oslo_concurrency.lockutils [req-f7af7536-39b9-4c54-bb92-8add4ca13398 req-9c804be1-4693-47db-a820-009ea5cacd00 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.192 227317 DEBUG oslo_concurrency.lockutils [req-f7af7536-39b9-4c54-bb92-8add4ca13398 req-9c804be1-4693-47db-a820-009ea5cacd00 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.192 227317 DEBUG oslo_concurrency.lockutils [req-f7af7536-39b9-4c54-bb92-8add4ca13398 req-9c804be1-4693-47db-a820-009ea5cacd00 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.193 227317 DEBUG nova.compute.manager [req-f7af7536-39b9-4c54-bb92-8add4ca13398 req-9c804be1-4693-47db-a820-009ea5cacd00 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:33:32 np0005596062 nova_compute[227313]: 2026-01-26 18:33:32.193 227317 WARNING nova.compute.manager [req-f7af7536-39b9-4c54-bb92-8add4ca13398 req-9c804be1-4693-47db-a820-009ea5cacd00 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:33:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:33.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:33 np0005596062 nova_compute[227313]: 2026-01-26 18:33:33.215 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:33.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:35.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:35.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:35Z|00170|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:33:35 np0005596062 nova_compute[227313]: 2026-01-26 18:33:35.699 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:35 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:35Z|00171|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:33:35 np0005596062 nova_compute[227313]: 2026-01-26 18:33:35.960 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:37 np0005596062 nova_compute[227313]: 2026-01-26 18:33:37.081 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:37.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:37.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:38 np0005596062 nova_compute[227313]: 2026-01-26 18:33:38.217 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:38 np0005596062 NetworkManager[48993]: <info>  [1769452418.2500] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 26 13:33:38 np0005596062 nova_compute[227313]: 2026-01-26 18:33:38.249 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:38 np0005596062 NetworkManager[48993]: <info>  [1769452418.2509] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 26 13:33:38 np0005596062 nova_compute[227313]: 2026-01-26 18:33:38.429 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:38 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:38Z|00172|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:33:38 np0005596062 nova_compute[227313]: 2026-01-26 18:33:38.448 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:39.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:39 np0005596062 nova_compute[227313]: 2026-01-26 18:33:39.311 227317 DEBUG nova.compute.manager [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-changed-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:33:39 np0005596062 nova_compute[227313]: 2026-01-26 18:33:39.312 227317 DEBUG nova.compute.manager [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Refreshing instance network info cache due to event network-changed-3cc10671-fb0e-4d3d-9b4e-93636e6c4238. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:33:39 np0005596062 nova_compute[227313]: 2026-01-26 18:33:39.312 227317 DEBUG oslo_concurrency.lockutils [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:33:39 np0005596062 nova_compute[227313]: 2026-01-26 18:33:39.313 227317 DEBUG oslo_concurrency.lockutils [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:33:39 np0005596062 nova_compute[227313]: 2026-01-26 18:33:39.313 227317 DEBUG nova.network.neutron [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Refreshing network info cache for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:33:39 np0005596062 podman[254671]: 2026-01-26 18:33:39.386765703 +0000 UTC m=+0.087222297 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:33:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:39.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:39 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:39Z|00173|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:33:39 np0005596062 nova_compute[227313]: 2026-01-26 18:33:39.572 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:41.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:41.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:41 np0005596062 nova_compute[227313]: 2026-01-26 18:33:41.915 227317 DEBUG nova.network.neutron [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updated VIF entry in instance network info cache for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:33:41 np0005596062 nova_compute[227313]: 2026-01-26 18:33:41.917 227317 DEBUG nova.network.neutron [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updating instance_info_cache with network_info: [{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:33:42 np0005596062 nova_compute[227313]: 2026-01-26 18:33:42.029 227317 DEBUG oslo_concurrency.lockutils [req-158b3aee-0149-4eed-9b92-284c29634c3d req-4cf8fd56-d187-41bd-a7fe-16714b6a9801 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:33:42 np0005596062 nova_compute[227313]: 2026-01-26 18:33:42.123 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:43.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:43 np0005596062 nova_compute[227313]: 2026-01-26 18:33:43.220 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:43.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:43 np0005596062 nova_compute[227313]: 2026-01-26 18:33:43.983 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:44Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:cc:99 10.100.0.9
Jan 26 13:33:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:44Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:cc:99 10.100.0.9
Jan 26 13:33:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:45.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:45.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:33:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:47.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:33:47 np0005596062 nova_compute[227313]: 2026-01-26 18:33:47.123 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:47.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:48 np0005596062 ovn_controller[133984]: 2026-01-26T18:33:48Z|00174|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:33:48 np0005596062 nova_compute[227313]: 2026-01-26 18:33:48.222 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:48 np0005596062 nova_compute[227313]: 2026-01-26 18:33:48.261 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:33:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:49.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:33:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:49.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:49 np0005596062 podman[254720]: 2026-01-26 18:33:49.875667691 +0000 UTC m=+0.086775346 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:33:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:51.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.157 227317 INFO nova.compute.manager [None req-e185bfe4-5a68-4194-8409-72b556b3b755 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Get console output#033[00m
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.165 227317 INFO oslo.privsep.daemon [None req-e185bfe4-5a68-4194-8409-72b556b3b755 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpv27o_c3d/privsep.sock']#033[00m
Jan 26 13:33:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:51.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.502 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.845 227317 INFO oslo.privsep.daemon [None req-e185bfe4-5a68-4194-8409-72b556b3b755 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.737 254751 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.744 254751 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.747 254751 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.747 254751 INFO oslo.privsep.daemon [-] privsep daemon running as pid 254751#033[00m
Jan 26 13:33:51 np0005596062 nova_compute[227313]: 2026-01-26 18:33:51.948 254751 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 13:33:52 np0005596062 nova_compute[227313]: 2026-01-26 18:33:52.126 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:33:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:53.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:33:53 np0005596062 nova_compute[227313]: 2026-01-26 18:33:53.224 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:53.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:53 np0005596062 nova_compute[227313]: 2026-01-26 18:33:53.829 227317 INFO nova.compute.manager [None req-3568da29-14d4-4434-94f0-41f6d1efa67b ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Get console output#033[00m
Jan 26 13:33:53 np0005596062 nova_compute[227313]: 2026-01-26 18:33:53.837 254751 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 13:33:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:55.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:55.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:33:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:57.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:57 np0005596062 nova_compute[227313]: 2026-01-26 18:33:57.128 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:57.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:58 np0005596062 nova_compute[227313]: 2026-01-26 18:33:58.262 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:33:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:33:59.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:33:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:33:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:33:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:33:59.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:00 np0005596062 nova_compute[227313]: 2026-01-26 18:34:00.360 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Check if temp file /var/lib/nova/instances/tmp3amgnjh_ exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 26 13:34:00 np0005596062 nova_compute[227313]: 2026-01-26 18:34:00.361 227317 DEBUG nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3amgnjh_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='b43c1568-f367-4a8e-beda-27b963ce3769',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 26 13:34:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:00.457 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:34:00 np0005596062 nova_compute[227313]: 2026-01-26 18:34:00.458 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:00.459 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:34:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:01.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:01.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:02 np0005596062 nova_compute[227313]: 2026-01-26 18:34:02.131 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:34:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:03.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:34:03 np0005596062 nova_compute[227313]: 2026-01-26 18:34:03.264 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:03.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:04 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:04Z|00175|binding|INFO|Releasing lport 8e79441f-a5e8-497c-8f8b-e77378840eed from this chassis (sb_readonly=0)
Jan 26 13:34:04 np0005596062 nova_compute[227313]: 2026-01-26 18:34:04.062 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:05.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:05.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:06 np0005596062 nova_compute[227313]: 2026-01-26 18:34:06.708 227317 DEBUG nova.compute.manager [req-2ce1409d-2524-4e56-b1cb-849f2a7a4c20 req-71fa73c2-3207-4da5-b14c-cf4f163df39d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:06 np0005596062 nova_compute[227313]: 2026-01-26 18:34:06.708 227317 DEBUG oslo_concurrency.lockutils [req-2ce1409d-2524-4e56-b1cb-849f2a7a4c20 req-71fa73c2-3207-4da5-b14c-cf4f163df39d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:06 np0005596062 nova_compute[227313]: 2026-01-26 18:34:06.708 227317 DEBUG oslo_concurrency.lockutils [req-2ce1409d-2524-4e56-b1cb-849f2a7a4c20 req-71fa73c2-3207-4da5-b14c-cf4f163df39d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:06 np0005596062 nova_compute[227313]: 2026-01-26 18:34:06.708 227317 DEBUG oslo_concurrency.lockutils [req-2ce1409d-2524-4e56-b1cb-849f2a7a4c20 req-71fa73c2-3207-4da5-b14c-cf4f163df39d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:06 np0005596062 nova_compute[227313]: 2026-01-26 18:34:06.709 227317 DEBUG nova.compute.manager [req-2ce1409d-2524-4e56-b1cb-849f2a7a4c20 req-71fa73c2-3207-4da5-b14c-cf4f163df39d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:06 np0005596062 nova_compute[227313]: 2026-01-26 18:34:06.709 227317 DEBUG nova.compute.manager [req-2ce1409d-2524-4e56-b1cb-849f2a7a4c20 req-71fa73c2-3207-4da5-b14c-cf4f163df39d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.134 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:07.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:07 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:34:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:07.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:07 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:07.460 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.488 227317 INFO nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Took 5.78 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.489 227317 DEBUG nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.507 227317 DEBUG nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3amgnjh_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='b43c1568-f367-4a8e-beda-27b963ce3769',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ef9fca76-d908-4d36-a509-08bbc1445aed),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.510 227317 DEBUG nova.objects.instance [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lazy-loading 'migration_context' on Instance uuid b43c1568-f367-4a8e-beda-27b963ce3769 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.511 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.513 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.513 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.530 227317 DEBUG nova.virt.libvirt.vif [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1719150974',display_name='tempest-TestNetworkAdvancedServerOps-server-1719150974',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1719150974',id=23,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDEOHm6WKCKYdZcYs7kRuLusjif5ojXniiJMhrJrHg7YfxMMa9vKhpHePgKZBFWmUylH+aD0GyBeL4fpDu6rEGxI0F93dA9XsN5JYScbkt/Ge6Oa00kHQ2bRd1K4UHVxMw==',key_name='tempest-TestNetworkAdvancedServerOps-1342297099',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:33:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-51db0pi2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:33:30Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b43c1568-f367-4a8e-beda-27b963ce3769,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.531 227317 DEBUG nova.network.os_vif_util [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Converting VIF {"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.532 227317 DEBUG nova.network.os_vif_util [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.532 227317 DEBUG nova.virt.libvirt.migration [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 13:34:07 np0005596062 nova_compute[227313]:  <mac address="fa:16:3e:78:cc:99"/>
Jan 26 13:34:07 np0005596062 nova_compute[227313]:  <model type="virtio"/>
Jan 26 13:34:07 np0005596062 nova_compute[227313]:  <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:34:07 np0005596062 nova_compute[227313]:  <mtu size="1442"/>
Jan 26 13:34:07 np0005596062 nova_compute[227313]:  <target dev="tap3cc10671-fb"/>
Jan 26 13:34:07 np0005596062 nova_compute[227313]: </interface>
Jan 26 13:34:07 np0005596062 nova_compute[227313]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 26 13:34:07 np0005596062 nova_compute[227313]: 2026-01-26 18:34:07.533 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.016 227317 DEBUG nova.virt.libvirt.migration [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.017 227317 INFO nova.virt.libvirt.migration [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.118 227317 INFO nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.304 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.622 227317 DEBUG nova.virt.libvirt.migration [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.623 227317 DEBUG nova.virt.libvirt.migration [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.759 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452448.758557, b43c1568-f367-4a8e-beda-27b963ce3769 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.760 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.805 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.810 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.836 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.919 227317 DEBUG nova.compute.manager [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.919 227317 DEBUG oslo_concurrency.lockutils [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.919 227317 DEBUG oslo_concurrency.lockutils [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.920 227317 DEBUG oslo_concurrency.lockutils [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.920 227317 DEBUG nova.compute.manager [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.920 227317 WARNING nova.compute.manager [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.921 227317 DEBUG nova.compute.manager [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-changed-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.921 227317 DEBUG nova.compute.manager [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Refreshing instance network info cache due to event network-changed-3cc10671-fb0e-4d3d-9b4e-93636e6c4238. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.921 227317 DEBUG oslo_concurrency.lockutils [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.922 227317 DEBUG oslo_concurrency.lockutils [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.922 227317 DEBUG nova.network.neutron [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Refreshing network info cache for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:34:08 np0005596062 kernel: tap3cc10671-fb (unregistering): left promiscuous mode
Jan 26 13:34:08 np0005596062 NetworkManager[48993]: <info>  [1769452448.9434] device (tap3cc10671-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:34:08 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:08Z|00176|binding|INFO|Releasing lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 from this chassis (sb_readonly=0)
Jan 26 13:34:08 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:08Z|00177|binding|INFO|Setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 down in Southbound
Jan 26 13:34:08 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:08Z|00178|binding|INFO|Removing iface tap3cc10671-fb ovn-installed in OVS
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.956 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:08 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:08.958 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:cc:99 10.100.0.9'], port_security=['fa:16:3e:78:cc:99 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '657115c4-6394-40b2-9bbb-d787eaa44d24'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b43c1568-f367-4a8e-beda-27b963ce3769', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15f2d772-da47-4c77-8357-41c40294bae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '8', 'neutron:security_group_ids': '483e03f0-e8ac-4a81-9798-744393eb00da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bbb333e-4739-4b8f-a299-8418443b07cd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3cc10671-fb0e-4d3d-9b4e-93636e6c4238) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:34:08 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:08.959 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 in datapath 15f2d772-da47-4c77-8357-41c40294bae5 unbound from our chassis#033[00m
Jan 26 13:34:08 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:08.960 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 15f2d772-da47-4c77-8357-41c40294bae5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:34:08 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:08.961 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cf830e52-4c6a-438c-ba66-f07261e4f150]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:08 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:08.962 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5 namespace which is not needed anymore#033[00m
Jan 26 13:34:08 np0005596062 nova_compute[227313]: 2026-01-26 18:34:08.982 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:09 np0005596062 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 26 13:34:09 np0005596062 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 15.240s CPU time.
Jan 26 13:34:09 np0005596062 systemd-machined[195380]: Machine qemu-17-instance-00000017 terminated.
Jan 26 13:34:09 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [NOTICE]   (254630) : haproxy version is 2.8.14-c23fe91
Jan 26 13:34:09 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [NOTICE]   (254630) : path to executable is /usr/sbin/haproxy
Jan 26 13:34:09 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [WARNING]  (254630) : Exiting Master process...
Jan 26 13:34:09 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [WARNING]  (254630) : Exiting Master process...
Jan 26 13:34:09 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [ALERT]    (254630) : Current worker (254632) exited with code 143 (Terminated)
Jan 26 13:34:09 np0005596062 neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5[254626]: [WARNING]  (254630) : All workers exited. Exiting... (0)
Jan 26 13:34:09 np0005596062 systemd[1]: libpod-3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1.scope: Deactivated successfully.
Jan 26 13:34:09 np0005596062 podman[255019]: 2026-01-26 18:34:09.094158924 +0000 UTC m=+0.043824280 container died 3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:34:09 np0005596062 virtqemud[226715]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/b43c1568-f367-4a8e-beda-27b963ce3769_disk: No such file or directory
Jan 26 13:34:09 np0005596062 virtqemud[226715]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/b43c1568-f367-4a8e-beda-27b963ce3769_disk: No such file or directory
Jan 26 13:34:09 np0005596062 kernel: tap3cc10671-fb: entered promiscuous mode
Jan 26 13:34:09 np0005596062 systemd-udevd[255000]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:34:09 np0005596062 NetworkManager[48993]: <info>  [1769452449.1102] manager: (tap3cc10671-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Jan 26 13:34:09 np0005596062 kernel: tap3cc10671-fb (unregistering): left promiscuous mode
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.112 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00179|binding|INFO|Claiming lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for this chassis.
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00180|binding|INFO|3cc10671-fb0e-4d3d-9b4e-93636e6c4238: Claiming fa:16:3e:78:cc:99 10.100.0.9
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.120 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:cc:99 10.100.0.9'], port_security=['fa:16:3e:78:cc:99 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '657115c4-6394-40b2-9bbb-d787eaa44d24'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b43c1568-f367-4a8e-beda-27b963ce3769', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15f2d772-da47-4c77-8357-41c40294bae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '8', 'neutron:security_group_ids': '483e03f0-e8ac-4a81-9798-744393eb00da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bbb333e-4739-4b8f-a299-8418443b07cd, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3cc10671-fb0e-4d3d-9b4e-93636e6c4238) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:34:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:34:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:09.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.140 227317 DEBUG nova.virt.libvirt.guest [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.140 227317 INFO nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migration operation has completed#033[00m
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.141 227317 INFO nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] _post_live_migration() is started..#033[00m
Jan 26 13:34:09 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1-userdata-shm.mount: Deactivated successfully.
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00181|binding|INFO|Setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 ovn-installed in OVS
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00182|binding|INFO|Setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 up in Southbound
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00183|binding|INFO|Releasing lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 from this chassis (sb_readonly=1)
Jan 26 13:34:09 np0005596062 systemd[1]: var-lib-containers-storage-overlay-a4bd6e8a9065bd971a4aa0640968a1ff109a5815285517f8775b948a121827a4-merged.mount: Deactivated successfully.
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00184|if_status|INFO|Dropped 2 log messages in last 1465 seconds (most recently, 1465 seconds ago) due to excessive rate
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00185|if_status|INFO|Not setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 down as sb is readonly
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00186|binding|INFO|Removing iface tap3cc10671-fb ovn-installed in OVS
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.150 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.152 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.153 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00187|binding|INFO|Releasing lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 from this chassis (sb_readonly=0)
Jan 26 13:34:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:09Z|00188|binding|INFO|Setting lport 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 down in Southbound
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.153 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.160 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.162 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:cc:99 10.100.0.9'], port_security=['fa:16:3e:78:cc:99 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '657115c4-6394-40b2-9bbb-d787eaa44d24'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b43c1568-f367-4a8e-beda-27b963ce3769', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15f2d772-da47-4c77-8357-41c40294bae5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '8', 'neutron:security_group_ids': '483e03f0-e8ac-4a81-9798-744393eb00da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bbb333e-4739-4b8f-a299-8418443b07cd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=3cc10671-fb0e-4d3d-9b4e-93636e6c4238) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.183 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.184 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.185 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:09 np0005596062 podman[255019]: 2026-01-26 18:34:09.194507001 +0000 UTC m=+0.144172367 container cleanup 3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 26 13:34:09 np0005596062 systemd[1]: libpod-conmon-3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1.scope: Deactivated successfully.
Jan 26 13:34:09 np0005596062 podman[255053]: 2026-01-26 18:34:09.274989708 +0000 UTC m=+0.061232415 container remove 3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.280 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[976d8c48-b068-44b1-ac50-e313386e7b7f]: (4, ('Mon Jan 26 06:34:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5 (3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1)\n3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1\nMon Jan 26 06:34:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5 (3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1)\n3b2ee5cca3027a73f38a4ab912175230a9c489a414dd4281504babecec9ee0f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.282 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd07a22-2a47-44b2-a7c0-b26e37bb93fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.283 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15f2d772-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.284 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:09 np0005596062 kernel: tap15f2d772-d0: left promiscuous mode
Jan 26 13:34:09 np0005596062 nova_compute[227313]: 2026-01-26 18:34:09.302 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.305 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[08ce585c-1cd3-4df3-a06c-39c3bebcdd15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.320 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c55174-21f2-422a-bc3a-e0360bc55266]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.321 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cd63369f-be45-47c3-8a4b-dfd562d7a29c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.336 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[55520d4f-799f-4084-9814-39a7ce886870]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616962, 'reachable_time': 17131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255073, 'error': None, 'target': 'ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 systemd[1]: run-netns-ovnmeta\x2d15f2d772\x2dda47\x2d4c77\x2d8357\x2d41c40294bae5.mount: Deactivated successfully.
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.340 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-15f2d772-da47-4c77-8357-41c40294bae5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.340 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[21dd1d82-d2b9-4b36-a617-f199cb7304c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.340 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 in datapath 15f2d772-da47-4c77-8357-41c40294bae5 unbound from our chassis#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.341 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 15f2d772-da47-4c77-8357-41c40294bae5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.342 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0791c5-8005-4d01-ba63-2520c01e4d72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.342 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 in datapath 15f2d772-da47-4c77-8357-41c40294bae5 unbound from our chassis#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.343 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 15f2d772-da47-4c77-8357-41c40294bae5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:34:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:09.344 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e47d17-0645-41cc-8010-f7d0049e3af4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:09.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:09 np0005596062 podman[255074]: 2026-01-26 18:34:09.861631866 +0000 UTC m=+0.063393152 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.207 227317 DEBUG nova.compute.manager [req-92146d7e-4892-4ca3-b34a-ef19df2cf536 req-a8c834a9-9e8f-4d59-8ce8-8cce5525b904 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.207 227317 DEBUG oslo_concurrency.lockutils [req-92146d7e-4892-4ca3-b34a-ef19df2cf536 req-a8c834a9-9e8f-4d59-8ce8-8cce5525b904 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.207 227317 DEBUG oslo_concurrency.lockutils [req-92146d7e-4892-4ca3-b34a-ef19df2cf536 req-a8c834a9-9e8f-4d59-8ce8-8cce5525b904 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.207 227317 DEBUG oslo_concurrency.lockutils [req-92146d7e-4892-4ca3-b34a-ef19df2cf536 req-a8c834a9-9e8f-4d59-8ce8-8cce5525b904 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.208 227317 DEBUG nova.compute.manager [req-92146d7e-4892-4ca3-b34a-ef19df2cf536 req-a8c834a9-9e8f-4d59-8ce8-8cce5525b904 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.208 227317 WARNING nova.compute.manager [req-92146d7e-4892-4ca3-b34a-ef19df2cf536 req-a8c834a9-9e8f-4d59-8ce8-8cce5525b904 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.217 227317 DEBUG nova.network.neutron [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Activated binding for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.218 227317 DEBUG nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.219 227317 DEBUG nova.virt.libvirt.vif [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:33:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1719150974',display_name='tempest-TestNetworkAdvancedServerOps-server-1719150974',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1719150974',id=23,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDEOHm6WKCKYdZcYs7kRuLusjif5ojXniiJMhrJrHg7YfxMMa9vKhpHePgKZBFWmUylH+aD0GyBeL4fpDu6rEGxI0F93dA9XsN5JYScbkt/Ge6Oa00kHQ2bRd1K4UHVxMw==',key_name='tempest-TestNetworkAdvancedServerOps-1342297099',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:33:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-51db0pi2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:33:56Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b43c1568-f367-4a8e-beda-27b963ce3769,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.219 227317 DEBUG nova.network.os_vif_util [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Converting VIF {"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.219 227317 DEBUG nova.network.os_vif_util [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.220 227317 DEBUG os_vif [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.222 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.222 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cc10671-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.224 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.225 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.228 227317 INFO os_vif [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:cc:99,bridge_name='br-int',has_traffic_filtering=True,id=3cc10671-fb0e-4d3d-9b4e-93636e6c4238,network=Network(15f2d772-da47-4c77-8357-41c40294bae5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc10671-fb')#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.228 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.229 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.229 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.229 227317 DEBUG nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.229 227317 INFO nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Deleting instance files /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769_del#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.230 227317 INFO nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Deletion of /var/lib/nova/instances/b43c1568-f367-4a8e-beda-27b963ce3769_del complete#033[00m
Jan 26 13:34:10 np0005596062 nova_compute[227313]: 2026-01-26 18:34:10.549 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.055 227317 DEBUG nova.compute.manager [req-e03c6188-c902-40b5-b50b-a80d1d0efcac req-f29b341b-37bd-49aa-a909-a21c946ee3b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.056 227317 DEBUG oslo_concurrency.lockutils [req-e03c6188-c902-40b5-b50b-a80d1d0efcac req-f29b341b-37bd-49aa-a909-a21c946ee3b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.056 227317 DEBUG oslo_concurrency.lockutils [req-e03c6188-c902-40b5-b50b-a80d1d0efcac req-f29b341b-37bd-49aa-a909-a21c946ee3b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.056 227317 DEBUG oslo_concurrency.lockutils [req-e03c6188-c902-40b5-b50b-a80d1d0efcac req-f29b341b-37bd-49aa-a909-a21c946ee3b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.057 227317 DEBUG nova.compute.manager [req-e03c6188-c902-40b5-b50b-a80d1d0efcac req-f29b341b-37bd-49aa-a909-a21c946ee3b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.057 227317 DEBUG nova.compute.manager [req-e03c6188-c902-40b5-b50b-a80d1d0efcac req-f29b341b-37bd-49aa-a909-a21c946ee3b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:34:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:11.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.207 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.394 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:34:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:11.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.491 227317 DEBUG nova.network.neutron [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updated VIF entry in instance network info cache for port 3cc10671-fb0e-4d3d-9b4e-93636e6c4238. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.491 227317 DEBUG nova.network.neutron [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updating instance_info_cache with network_info: [{"id": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "address": "fa:16:3e:78:cc:99", "network": {"id": "15f2d772-da47-4c77-8357-41c40294bae5", "bridge": "br-int", "label": "tempest-network-smoke--1281266081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc10671-fb", "ovs_interfaceid": "3cc10671-fb0e-4d3d-9b4e-93636e6c4238", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:34:11 np0005596062 nova_compute[227313]: 2026-01-26 18:34:11.594 227317 DEBUG oslo_concurrency.lockutils [req-66a37bd8-97b1-4f34-ae22-341404f20025 req-1759d214-02c1-40b6-bc71-10e1c5ce53c0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-b43c1568-f367-4a8e-beda-27b963ce3769" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:34:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.136 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.220 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.221 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.221 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.221 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.221 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.413 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.414 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.414 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.415 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.415 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.415 227317 WARNING nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.415 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.416 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.416 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.416 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.416 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.416 227317 WARNING nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.416 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.417 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.417 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.417 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.417 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.417 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-unplugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.418 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.418 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.418 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.418 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.418 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.419 227317 WARNING nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.419 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.419 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.419 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.419 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.419 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.420 227317 WARNING nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.420 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.420 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.420 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.420 227317 DEBUG oslo_concurrency.lockutils [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.421 227317 DEBUG nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] No waiting events found dispatching network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.421 227317 WARNING nova.compute.manager [req-8860defc-d260-4002-94c8-341025dc3201 req-5986093a-7663-4e32-8914-9d1e5ed518b8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Received unexpected event network-vif-plugged-3cc10671-fb0e-4d3d-9b4e-93636e6c4238 for instance with vm_state active and task_state migrating.#033[00m
Jan 26 13:34:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:34:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4022270513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.679 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.832 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.834 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4650MB free_disk=20.942729949951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.834 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.835 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.923 227317 INFO nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Updating resource usage from migration ef9fca76-d908-4d36-a509-08bbc1445aed#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.968 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Migration ef9fca76-d908-4d36-a509-08bbc1445aed is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.969 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:34:12 np0005596062 nova_compute[227313]: 2026-01-26 18:34:12.969 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:34:13 np0005596062 nova_compute[227313]: 2026-01-26 18:34:13.024 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:13.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:34:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/962253873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:34:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:13 np0005596062 nova_compute[227313]: 2026-01-26 18:34:13.466 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:13 np0005596062 nova_compute[227313]: 2026-01-26 18:34:13.471 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:34:13 np0005596062 nova_compute[227313]: 2026-01-26 18:34:13.490 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:34:13 np0005596062 nova_compute[227313]: 2026-01-26 18:34:13.514 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:34:13 np0005596062 nova_compute[227313]: 2026-01-26 18:34:13.514 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:15.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:15 np0005596062 nova_compute[227313]: 2026-01-26 18:34:15.224 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:15.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:17 np0005596062 nova_compute[227313]: 2026-01-26 18:34:17.137 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:17.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:17.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.110 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Acquiring lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.110 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.110 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "b43c1568-f367-4a8e-beda-27b963ce3769-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:19.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.159 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.160 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.160 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.160 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.160 227317 DEBUG oslo_concurrency.processutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:34:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:19.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.514 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:34:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:34:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/343784277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.607 227317 DEBUG oslo_concurrency.processutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.627 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.755 227317 WARNING nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.756 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4662MB free_disk=20.942726135253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.756 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:34:19 np0005596062 nova_compute[227313]: 2026-01-26 18:34:19.756 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:34:20 np0005596062 nova_compute[227313]: 2026-01-26 18:34:20.019 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Migration for instance b43c1568-f367-4a8e-beda-27b963ce3769 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 26 13:34:20 np0005596062 nova_compute[227313]: 2026-01-26 18:34:20.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:20 np0005596062 nova_compute[227313]: 2026-01-26 18:34:20.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:20 np0005596062 nova_compute[227313]: 2026-01-26 18:34:20.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:34:20 np0005596062 nova_compute[227313]: 2026-01-26 18:34:20.227 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:20 np0005596062 nova_compute[227313]: 2026-01-26 18:34:20.427 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 26 13:34:20 np0005596062 podman[255218]: 2026-01-26 18:34:20.888365171 +0000 UTC m=+0.104827087 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 13:34:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:21.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:21.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.548 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.647 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Migration ef9fca76-d908-4d36-a509-08bbc1445aed is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.647 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 13:34:21 np0005596062 nova_compute[227313]: 2026-01-26 18:34:21.647 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 13:34:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:22 np0005596062 nova_compute[227313]: 2026-01-26 18:34:22.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:22 np0005596062 nova_compute[227313]: 2026-01-26 18:34:22.139 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:22 np0005596062 nova_compute[227313]: 2026-01-26 18:34:22.711 227317 DEBUG oslo_concurrency.processutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:34:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1167864172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.128 227317 DEBUG oslo_concurrency.processutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.136 227317 DEBUG nova.compute.provider_tree [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:34:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:23.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.161 227317 DEBUG nova.scheduler.client.report [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:34:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:34:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:23.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.601 227317 DEBUG nova.compute.resource_tracker [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.602 227317 DEBUG oslo_concurrency.lockutils [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:34:23 np0005596062 nova_compute[227313]: 2026-01-26 18:34:23.608 227317 INFO nova.compute.manager [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 13:34:24 np0005596062 nova_compute[227313]: 2026-01-26 18:34:24.138 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452449.1379502, b43c1568-f367-4a8e-beda-27b963ce3769 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:34:24 np0005596062 nova_compute[227313]: 2026-01-26 18:34:24.139 227317 INFO nova.compute.manager [-] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] VM Stopped (Lifecycle Event)
Jan 26 13:34:24 np0005596062 nova_compute[227313]: 2026-01-26 18:34:24.291 227317 DEBUG nova.compute.manager [None req-cbcff4b1-70ad-4165-8128-8ef548c3bf80 - - - - - -] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:34:24 np0005596062 nova_compute[227313]: 2026-01-26 18:34:24.485 227317 INFO nova.scheduler.client.report [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] Deleted allocation for migration ef9fca76-d908-4d36-a509-08bbc1445aed
Jan 26 13:34:24 np0005596062 nova_compute[227313]: 2026-01-26 18:34:24.487 227317 DEBUG nova.virt.libvirt.driver [None req-728c6cfa-4c40-446d-a42e-5cc01ad7ac4c 928382b4417c43a2b3bcffe23565d8c4 ac06757577d849f28b5779fb516f263a - - default default] [instance: b43c1568-f367-4a8e-beda-27b963ce3769] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 26 13:34:25 np0005596062 nova_compute[227313]: 2026-01-26 18:34:25.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:25.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:25 np0005596062 nova_compute[227313]: 2026-01-26 18:34:25.230 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:25.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:26 np0005596062 nova_compute[227313]: 2026-01-26 18:34:26.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:34:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:27 np0005596062 nova_compute[227313]: 2026-01-26 18:34:27.141 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:27.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:27.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:29.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:29.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:30 np0005596062 nova_compute[227313]: 2026-01-26 18:34:30.231 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:30 np0005596062 nova_compute[227313]: 2026-01-26 18:34:30.824 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:34:30 np0005596062 nova_compute[227313]: 2026-01-26 18:34:30.825 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:34:30 np0005596062 nova_compute[227313]: 2026-01-26 18:34:30.869 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.129 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.129 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.135 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.136 227317 INFO nova.compute.claims [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Claim successful on node compute-2.ctlplane.example.com
Jan 26 13:34:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:31.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.460 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:34:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:31.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:34:31 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/978074995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.915 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.920 227317 DEBUG nova.compute.provider_tree [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:34:31 np0005596062 nova_compute[227313]: 2026-01-26 18:34:31.952 227317 DEBUG nova.scheduler.client.report [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.108 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.109 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.143 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.523 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.524 227317 DEBUG nova.network.neutron [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.618 227317 INFO nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.669 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.863 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.865 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.865 227317 INFO nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Creating image(s)
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.892 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.917 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.943 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.947 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:32 np0005596062 nova_compute[227313]: 2026-01-26 18:34:32.972 227317 DEBUG nova.policy [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95ada1688fc843cb979bd6c75b517e4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d8fc25e3f054e988a715ec90a59c8d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.022 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.022 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.023 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.023 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.049 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.052 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 f80d1b91-bc87-418e-aa99-016d72cd668f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:33.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.475 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 f80d1b91-bc87-418e-aa99-016d72cd668f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:34:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:33.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:33 np0005596062 nova_compute[227313]: 2026-01-26 18:34:33.539 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] resizing rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 13:34:34 np0005596062 nova_compute[227313]: 2026-01-26 18:34:34.256 227317 DEBUG nova.objects.instance [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lazy-loading 'migration_context' on Instance uuid f80d1b91-bc87-418e-aa99-016d72cd668f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:34:34 np0005596062 nova_compute[227313]: 2026-01-26 18:34:34.291 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 13:34:34 np0005596062 nova_compute[227313]: 2026-01-26 18:34:34.292 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Ensure instance console log exists: /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 13:34:34 np0005596062 nova_compute[227313]: 2026-01-26 18:34:34.292 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:34:34 np0005596062 nova_compute[227313]: 2026-01-26 18:34:34.292 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:34:34 np0005596062 nova_compute[227313]: 2026-01-26 18:34:34.293 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:34:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:35.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:35 np0005596062 nova_compute[227313]: 2026-01-26 18:34:35.232 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:35.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:36 np0005596062 nova_compute[227313]: 2026-01-26 18:34:36.809 227317 DEBUG nova.network.neutron [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Successfully created port: 8320c871-95fc-4422-a1de-f44442d03880 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 13:34:37 np0005596062 nova_compute[227313]: 2026-01-26 18:34:37.145 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:37.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:37.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:39.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:39.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:39.862 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 13:34:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:39.863 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 13:34:39 np0005596062 nova_compute[227313]: 2026-01-26 18:34:39.864 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.159 227317 DEBUG nova.network.neutron [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Successfully updated port: 8320c871-95fc-4422-a1de-f44442d03880 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.184 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.184 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquired lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.184 227317 DEBUG nova.network.neutron [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.234 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.382 227317 DEBUG nova.compute.manager [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-changed-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.382 227317 DEBUG nova.compute.manager [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Refreshing instance network info cache due to event network-changed-8320c871-95fc-4422-a1de-f44442d03880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.383 227317 DEBUG oslo_concurrency.lockutils [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:34:40 np0005596062 nova_compute[227313]: 2026-01-26 18:34:40.486 227317 DEBUG nova.network.neutron [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:34:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:34:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4199676052' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:34:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:34:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4199676052' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:34:40 np0005596062 podman[255515]: 2026-01-26 18:34:40.869714677 +0000 UTC m=+0.068411156 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 13:34:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:41.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:41.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.147 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.323 227317 DEBUG nova.network.neutron [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updating instance_info_cache with network_info: [{"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.487 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Releasing lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.487 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Instance network_info: |[{"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.488 227317 DEBUG oslo_concurrency.lockutils [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.488 227317 DEBUG nova.network.neutron [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Refreshing network info cache for port 8320c871-95fc-4422-a1de-f44442d03880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.491 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Start _get_guest_xml network_info=[{"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.496 227317 WARNING nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.503 227317 DEBUG nova.virt.libvirt.host [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.504 227317 DEBUG nova.virt.libvirt.host [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.507 227317 DEBUG nova.virt.libvirt.host [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.508 227317 DEBUG nova.virt.libvirt.host [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.509 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.509 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.509 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.509 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.510 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.510 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.510 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.510 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.511 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.511 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.511 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.511 227317 DEBUG nova.virt.hardware [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.513 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:34:42 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3263032913' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:34:42 np0005596062 nova_compute[227313]: 2026-01-26 18:34:42.975 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.000 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.004 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:43.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:34:43 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2554229135' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.462 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.464 227317 DEBUG nova.virt.libvirt.vif [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:34:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-62938558-acce',id=24,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLJiWXodl7Hxj/wweYnSnHc0FwF/SxnekNPyxp62l3q40shVsfGDjmtDdVKQ0kP9OsoTDawgSo4Uj+q+UwvV4zDAKvLswhlqAYblh2u6w9Nv3LR2wklf01sjr7GPx4j6gg==',key_name='tempest-TestSecurityGroupsBasicOps-1930721353',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d8fc25e3f054e988a715ec90a59c8d0',ramdisk_id='',reservation_id='r-6xlv7w50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-62938558',owner_user_name='tempest-TestSecurityGroupsBasicOps-62938558-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:34:32Z,user_data=None,user_id='95ada1688fc843cb979bd6c75b517e4a',uuid=f80d1b91-bc87-418e-aa99-016d72cd668f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.464 227317 DEBUG nova.network.os_vif_util [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Converting VIF {"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.465 227317 DEBUG nova.network.os_vif_util [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.466 227317 DEBUG nova.objects.instance [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid f80d1b91-bc87-418e-aa99-016d72cd668f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.494 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <uuid>f80d1b91-bc87-418e-aa99-016d72cd668f</uuid>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <name>instance-00000018</name>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959</nova:name>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:34:42</nova:creationTime>
Jan 26 13:34:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:user uuid="95ada1688fc843cb979bd6c75b517e4a">tempest-TestSecurityGroupsBasicOps-62938558-project-member</nova:user>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:project uuid="8d8fc25e3f054e988a715ec90a59c8d0">tempest-TestSecurityGroupsBasicOps-62938558</nova:project>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <nova:port uuid="8320c871-95fc-4422-a1de-f44442d03880">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <entry name="serial">f80d1b91-bc87-418e-aa99-016d72cd668f</entry>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <entry name="uuid">f80d1b91-bc87-418e-aa99-016d72cd668f</entry>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:34:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/f80d1b91-bc87-418e-aa99-016d72cd668f_disk">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:34:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:43.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/f80d1b91-bc87-418e-aa99-016d72cd668f_disk.config">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:c6:40:c3"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <target dev="tap8320c871-95"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/console.log" append="off"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:34:43 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:34:43 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:34:43 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:34:43 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.495 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Preparing to wait for external event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.495 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.496 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.496 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.496 227317 DEBUG nova.virt.libvirt.vif [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:34:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-62938558-acce',id=24,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLJiWXodl7Hxj/wweYnSnHc0FwF/SxnekNPyxp62l3q40shVsfGDjmtDdVKQ0kP9OsoTDawgSo4Uj+q+UwvV4zDAKvLswhlqAYblh2u6w9Nv3LR2wklf01sjr7GPx4j6gg==',key_name='tempest-TestSecurityGroupsBasicOps-1930721353',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d8fc25e3f054e988a715ec90a59c8d0',ramdisk_id='',reservation_id='r-6xlv7w50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-62938558',owner_user_name='tempest-TestSecurityGroupsBasicOps-62938558-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:34:32Z,user_data=None,user_id='95ada1688fc843cb979bd6c75b517e4a',uuid=f80d1b91-bc87-418e-aa99-016d72cd668f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.497 227317 DEBUG nova.network.os_vif_util [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Converting VIF {"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.497 227317 DEBUG nova.network.os_vif_util [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.497 227317 DEBUG os_vif [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.498 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.498 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.499 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.501 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.501 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8320c871-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.502 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8320c871-95, col_values=(('external_ids', {'iface-id': '8320c871-95fc-4422-a1de-f44442d03880', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:40:c3', 'vm-uuid': 'f80d1b91-bc87-418e-aa99-016d72cd668f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.503 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:43 np0005596062 NetworkManager[48993]: <info>  [1769452483.5040] manager: (tap8320c871-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.506 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.507 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.509 227317 INFO os_vif [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95')#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.713 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.713 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.713 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] No VIF found with MAC fa:16:3e:c6:40:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.714 227317 INFO nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Using config drive#033[00m
Jan 26 13:34:43 np0005596062 nova_compute[227313]: 2026-01-26 18:34:43.734 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:34:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:43.864 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.311 227317 INFO nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Creating config drive at /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/disk.config#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.320 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9y0zc0zz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.451 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9y0zc0zz" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.478 227317 DEBUG nova.storage.rbd_utils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] rbd image f80d1b91-bc87-418e-aa99-016d72cd668f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.481 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/disk.config f80d1b91-bc87-418e-aa99-016d72cd668f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.663 227317 DEBUG oslo_concurrency.processutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/disk.config f80d1b91-bc87-418e-aa99-016d72cd668f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.664 227317 INFO nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Deleting local config drive /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f/disk.config because it was imported into RBD.#033[00m
Jan 26 13:34:44 np0005596062 kernel: tap8320c871-95: entered promiscuous mode
Jan 26 13:34:44 np0005596062 NetworkManager[48993]: <info>  [1769452484.7137] manager: (tap8320c871-95): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 26 13:34:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:44Z|00189|binding|INFO|Claiming lport 8320c871-95fc-4422-a1de-f44442d03880 for this chassis.
Jan 26 13:34:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:44Z|00190|binding|INFO|8320c871-95fc-4422-a1de-f44442d03880: Claiming fa:16:3e:c6:40:c3 10.100.0.9
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.714 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.728 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:40:c3 10.100.0.9'], port_security=['fa:16:3e:c6:40:c3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f80d1b91-bc87-418e-aa99-016d72cd668f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d8fc25e3f054e988a715ec90a59c8d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1c7a93f7-da8c-4aec-a2ab-f853885c84e4 fcb5eda1-9949-4ddf-981f-5bf7d419bddd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62ca4ea7-3ca5-45be-aa51-7c1f07dca09c, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=8320c871-95fc-4422-a1de-f44442d03880) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.730 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 8320c871-95fc-4422-a1de-f44442d03880 in datapath aca06a26-1ace-452c-b833-7d0f7b878fe7 bound to our chassis#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.731 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aca06a26-1ace-452c-b833-7d0f7b878fe7#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.741 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ef130323-12f4-42ac-ab35-d55d805e1e3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.742 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaca06a26-11 in ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.744 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaca06a26-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.744 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cf94d9f6-ac65-40f8-b5ac-c53d3f0b87dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.745 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[cd2480a5-40ed-4d32-b234-08a6a2922a3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 systemd-udevd[255672]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:34:44 np0005596062 systemd-machined[195380]: New machine qemu-18-instance-00000018.
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.757 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[24046cf3-6097-49b5-87d4-395e1779185c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 NetworkManager[48993]: <info>  [1769452484.7609] device (tap8320c871-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:34:44 np0005596062 NetworkManager[48993]: <info>  [1769452484.7621] device (tap8320c871-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:34:44 np0005596062 systemd[1]: Started Virtual Machine qemu-18-instance-00000018.
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.777 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.780 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe761cb-2e99-4c90-8d0f-fa892c7a4cd4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:44Z|00191|binding|INFO|Setting lport 8320c871-95fc-4422-a1de-f44442d03880 ovn-installed in OVS
Jan 26 13:34:44 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:44Z|00192|binding|INFO|Setting lport 8320c871-95fc-4422-a1de-f44442d03880 up in Southbound
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.785 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.817 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d97e85-894c-4e4c-9847-2795d44c0f79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 systemd-udevd[255676]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.823 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[54dc5497-6e3d-4896-8c57-1bf32b59a5c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 NetworkManager[48993]: <info>  [1769452484.8250] manager: (tapaca06a26-10): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.857 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[2e04d8cf-21c1-49ac-ac26-e1dffeffa101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.860 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2d0c38-f506-431b-bfdf-383d0d2cd5d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 NetworkManager[48993]: <info>  [1769452484.8810] device (tapaca06a26-10): carrier: link connected
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.885 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[06d44dc7-1d99-4cd5-995f-a3d6ae64df89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.900 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[349bab07-ef52-4b8a-981e-62da69f1f9b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaca06a26-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:0c:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624493, 'reachable_time': 41219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255705, 'error': None, 'target': 'ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.915 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[91ccb38e-cc7f-4a83-9616-f00e73628048]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:cbf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624493, 'tstamp': 624493}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255706, 'error': None, 'target': 'ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.933 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc689e1-081e-4425-963c-06b12e57828b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaca06a26-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:0c:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624493, 'reachable_time': 41219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255707, 'error': None, 'target': 'ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:44.960 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ac61ffa0-e56f-4aac-af5e-8a8c7ab9d0f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.979 227317 DEBUG nova.network.neutron [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updated VIF entry in instance network info cache for port 8320c871-95fc-4422-a1de-f44442d03880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:34:44 np0005596062 nova_compute[227313]: 2026-01-26 18:34:44.980 227317 DEBUG nova.network.neutron [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updating instance_info_cache with network_info: [{"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.019 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb5d668-f362-41ed-9651-a9348feac813]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.020 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaca06a26-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.020 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.021 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaca06a26-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.065 227317 DEBUG oslo_concurrency.lockutils [req-f30ef80b-671d-4a5d-80f7-e2bb04ebedba req-23dbbc45-3296-4839-b285-2fc3cb95b30c 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:34:45 np0005596062 kernel: tapaca06a26-10: entered promiscuous mode
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.067 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.071 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:45 np0005596062 NetworkManager[48993]: <info>  [1769452485.0717] manager: (tapaca06a26-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.073 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaca06a26-10, col_values=(('external_ids', {'iface-id': '3dfd9042-2c6f-41ab-8a97-cbd186163aae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.074 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:45Z|00193|binding|INFO|Releasing lport 3dfd9042-2c6f-41ab-8a97-cbd186163aae from this chassis (sb_readonly=0)
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.075 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.077 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aca06a26-1ace-452c-b833-7d0f7b878fe7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aca06a26-1ace-452c-b833-7d0f7b878fe7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.078 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[665d2553-cf44-4bd4-8ca9-d41c6f6a02c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.078 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-aca06a26-1ace-452c-b833-7d0f7b878fe7
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/aca06a26-1ace-452c-b833-7d0f7b878fe7.pid.haproxy
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID aca06a26-1ace-452c-b833-7d0f7b878fe7
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:34:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:34:45.080 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'env', 'PROCESS_TAG=haproxy-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aca06a26-1ace-452c-b833-7d0f7b878fe7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.089 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:45.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.273 227317 DEBUG nova.compute.manager [req-bbaf037c-0e5a-468e-a055-f4a04195a140 req-f3364495-027e-407e-9050-73608deba79a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.274 227317 DEBUG oslo_concurrency.lockutils [req-bbaf037c-0e5a-468e-a055-f4a04195a140 req-f3364495-027e-407e-9050-73608deba79a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.275 227317 DEBUG oslo_concurrency.lockutils [req-bbaf037c-0e5a-468e-a055-f4a04195a140 req-f3364495-027e-407e-9050-73608deba79a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.275 227317 DEBUG oslo_concurrency.lockutils [req-bbaf037c-0e5a-468e-a055-f4a04195a140 req-f3364495-027e-407e-9050-73608deba79a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.275 227317 DEBUG nova.compute.manager [req-bbaf037c-0e5a-468e-a055-f4a04195a140 req-f3364495-027e-407e-9050-73608deba79a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Processing event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:34:45 np0005596062 podman[255739]: 2026-01-26 18:34:45.417630727 +0000 UTC m=+0.047598811 container create b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:34:45 np0005596062 systemd[1]: Started libpod-conmon-b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847.scope.
Jan 26 13:34:45 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:34:45 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e90b7756508072f255cd0b0b1fa8ce46b713bd323648c863865fe0de151f7d85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:34:45 np0005596062 podman[255739]: 2026-01-26 18:34:45.391238943 +0000 UTC m=+0.021207077 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:34:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:45.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:45 np0005596062 podman[255739]: 2026-01-26 18:34:45.498918835 +0000 UTC m=+0.128886949 container init b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 13:34:45 np0005596062 podman[255739]: 2026-01-26 18:34:45.504003901 +0000 UTC m=+0.133971985 container start b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 13:34:45 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [NOTICE]   (255759) : New worker (255761) forked
Jan 26 13:34:45 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [NOTICE]   (255759) : Loading success.
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.805 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.807 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452485.8043137, f80d1b91-bc87-418e-aa99-016d72cd668f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.808 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] VM Started (Lifecycle Event)#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.811 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.816 227317 INFO nova.virt.libvirt.driver [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Instance spawned successfully.#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.817 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.833 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.839 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.852 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.853 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.855 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.856 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.857 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.858 227317 DEBUG nova.virt.libvirt.driver [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.885 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.886 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452485.8055875, f80d1b91-bc87-418e-aa99-016d72cd668f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.886 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.924 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.929 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452485.8110235, f80d1b91-bc87-418e-aa99-016d72cd668f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.929 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.943 227317 INFO nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Took 13.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.944 227317 DEBUG nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.962 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:34:45 np0005596062 nova_compute[227313]: 2026-01-26 18:34:45.966 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:34:46 np0005596062 nova_compute[227313]: 2026-01-26 18:34:46.022 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:34:46 np0005596062 nova_compute[227313]: 2026-01-26 18:34:46.066 227317 INFO nova.compute.manager [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Took 15.02 seconds to build instance.#033[00m
Jan 26 13:34:46 np0005596062 nova_compute[227313]: 2026-01-26 18:34:46.111 227317 DEBUG oslo_concurrency.lockutils [None req-73157733-6a9a-41c6-87f6-6c71fc418555 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.150 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:47.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.373 227317 DEBUG nova.compute.manager [req-c2827b8c-859a-4c5a-b59d-2e6fcd602da8 req-ab510844-d814-4c8b-b95b-c8eca7781942 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.374 227317 DEBUG oslo_concurrency.lockutils [req-c2827b8c-859a-4c5a-b59d-2e6fcd602da8 req-ab510844-d814-4c8b-b95b-c8eca7781942 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.375 227317 DEBUG oslo_concurrency.lockutils [req-c2827b8c-859a-4c5a-b59d-2e6fcd602da8 req-ab510844-d814-4c8b-b95b-c8eca7781942 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.376 227317 DEBUG oslo_concurrency.lockutils [req-c2827b8c-859a-4c5a-b59d-2e6fcd602da8 req-ab510844-d814-4c8b-b95b-c8eca7781942 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.376 227317 DEBUG nova.compute.manager [req-c2827b8c-859a-4c5a-b59d-2e6fcd602da8 req-ab510844-d814-4c8b-b95b-c8eca7781942 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] No waiting events found dispatching network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:34:47 np0005596062 nova_compute[227313]: 2026-01-26 18:34:47.377 227317 WARNING nova.compute.manager [req-c2827b8c-859a-4c5a-b59d-2e6fcd602da8 req-ab510844-d814-4c8b-b95b-c8eca7781942 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received unexpected event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:34:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:47.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:48 np0005596062 nova_compute[227313]: 2026-01-26 18:34:48.504 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:49.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:50 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:50Z|00194|binding|INFO|Releasing lport 3dfd9042-2c6f-41ab-8a97-cbd186163aae from this chassis (sb_readonly=0)
Jan 26 13:34:50 np0005596062 NetworkManager[48993]: <info>  [1769452490.6893] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 26 13:34:50 np0005596062 NetworkManager[48993]: <info>  [1769452490.6902] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 26 13:34:50 np0005596062 nova_compute[227313]: 2026-01-26 18:34:50.692 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:50 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:50Z|00195|binding|INFO|Releasing lport 3dfd9042-2c6f-41ab-8a97-cbd186163aae from this chassis (sb_readonly=0)
Jan 26 13:34:50 np0005596062 nova_compute[227313]: 2026-01-26 18:34:50.714 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:50 np0005596062 nova_compute[227313]: 2026-01-26 18:34:50.719 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:50 np0005596062 nova_compute[227313]: 2026-01-26 18:34:50.838 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:51.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:51 np0005596062 nova_compute[227313]: 2026-01-26 18:34:51.402 227317 DEBUG nova.compute.manager [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-changed-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:34:51 np0005596062 nova_compute[227313]: 2026-01-26 18:34:51.402 227317 DEBUG nova.compute.manager [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Refreshing instance network info cache due to event network-changed-8320c871-95fc-4422-a1de-f44442d03880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:34:51 np0005596062 nova_compute[227313]: 2026-01-26 18:34:51.402 227317 DEBUG oslo_concurrency.lockutils [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:34:51 np0005596062 nova_compute[227313]: 2026-01-26 18:34:51.403 227317 DEBUG oslo_concurrency.lockutils [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:34:51 np0005596062 nova_compute[227313]: 2026-01-26 18:34:51.403 227317 DEBUG nova.network.neutron [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Refreshing network info cache for port 8320c871-95fc-4422-a1de-f44442d03880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:34:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:51.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:51 np0005596062 podman[255816]: 2026-01-26 18:34:51.86599259 +0000 UTC m=+0.080109218 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:34:52 np0005596062 nova_compute[227313]: 2026-01-26 18:34:52.153 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:53.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:53.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:53 np0005596062 nova_compute[227313]: 2026-01-26 18:34:53.506 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:54 np0005596062 nova_compute[227313]: 2026-01-26 18:34:54.661 227317 DEBUG nova.network.neutron [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updated VIF entry in instance network info cache for port 8320c871-95fc-4422-a1de-f44442d03880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:34:54 np0005596062 nova_compute[227313]: 2026-01-26 18:34:54.662 227317 DEBUG nova.network.neutron [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updating instance_info_cache with network_info: [{"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:34:54 np0005596062 nova_compute[227313]: 2026-01-26 18:34:54.688 227317 DEBUG oslo_concurrency.lockutils [req-a12497c6-ea29-4161-9bb6-7a805ebbf2b1 req-2eb820ba-a65d-4a1d-95ff-8813f4b18af1 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:34:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:55.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:55.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:34:57 np0005596062 nova_compute[227313]: 2026-01-26 18:34:57.155 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:57.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:34:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:57.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:58 np0005596062 nova_compute[227313]: 2026-01-26 18:34:58.509 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:59 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:59Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:40:c3 10.100.0.9
Jan 26 13:34:59 np0005596062 ovn_controller[133984]: 2026-01-26T18:34:59Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:40:c3 10.100.0.9
Jan 26 13:34:59 np0005596062 nova_compute[227313]: 2026-01-26 18:34:59.059 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:34:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:34:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:34:59.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:34:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:34:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:34:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:34:59.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:01.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:01.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:02 np0005596062 nova_compute[227313]: 2026-01-26 18:35:02.154 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:03.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:03 np0005596062 nova_compute[227313]: 2026-01-26 18:35:03.512 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:03.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:05.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:05.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:07 np0005596062 nova_compute[227313]: 2026-01-26 18:35:07.156 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:07.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:07.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:35:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:35:08 np0005596062 nova_compute[227313]: 2026-01-26 18:35:08.515 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:09.185 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:09.186 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:09.186 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:09.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:09.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:11.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:11 np0005596062 podman[256152]: 2026-01-26 18:35:11.870166377 +0000 UTC m=+0.074621891 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 13:35:12 np0005596062 nova_compute[227313]: 2026-01-26 18:35:12.158 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.189 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.189 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.189 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.190 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.190 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:35:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:13.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.517 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:13.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:35:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1294624274' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:35:13 np0005596062 nova_compute[227313]: 2026-01-26 18:35:13.670 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:35:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:15.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:15.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:35:16 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:35:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:16 np0005596062 nova_compute[227313]: 2026-01-26 18:35:16.928 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:35:16 np0005596062 nova_compute[227313]: 2026-01-26 18:35:16.929 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.092 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.093 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4523MB free_disk=20.942752838134766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.093 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.094 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.160 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:17.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.384 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance f80d1b91-bc87-418e-aa99-016d72cd668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.385 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:35:17 np0005596062 nova_compute[227313]: 2026-01-26 18:35:17.385 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:35:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:17.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:18 np0005596062 nova_compute[227313]: 2026-01-26 18:35:18.519 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:19.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.194 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:20.194 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:35:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:20.195 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.471 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:35:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:35:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3518612531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.899 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.905 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.948 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.991 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:35:20 np0005596062 nova_compute[227313]: 2026-01-26 18:35:20.992 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:21.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:22 np0005596062 nova_compute[227313]: 2026-01-26 18:35:22.162 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:22 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:22.197 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:35:22 np0005596062 podman[256322]: 2026-01-26 18:35:22.889913686 +0000 UTC m=+0.096974598 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:35:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:23.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:23 np0005596062 nova_compute[227313]: 2026-01-26 18:35:23.521 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:23.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.026 227317 DEBUG nova.compute.manager [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-changed-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.027 227317 DEBUG nova.compute.manager [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Refreshing instance network info cache due to event network-changed-8320c871-95fc-4422-a1de-f44442d03880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.027 227317 DEBUG oslo_concurrency.lockutils [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.027 227317 DEBUG oslo_concurrency.lockutils [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.027 227317 DEBUG nova.network.neutron [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Refreshing network info cache for port 8320c871-95fc-4422-a1de-f44442d03880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.177 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.177 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.178 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.178 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.178 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.179 227317 INFO nova.compute.manager [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Terminating instance#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.180 227317 DEBUG nova.compute.manager [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:35:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:25.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:25 np0005596062 kernel: tap8320c871-95 (unregistering): left promiscuous mode
Jan 26 13:35:25 np0005596062 NetworkManager[48993]: <info>  [1769452525.3762] device (tap8320c871-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.385 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00196|binding|INFO|Releasing lport 8320c871-95fc-4422-a1de-f44442d03880 from this chassis (sb_readonly=0)
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00197|binding|INFO|Setting lport 8320c871-95fc-4422-a1de-f44442d03880 down in Southbound
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00198|binding|INFO|Removing iface tap8320c871-95 ovn-installed in OVS
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.399 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:40:c3 10.100.0.9'], port_security=['fa:16:3e:c6:40:c3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f80d1b91-bc87-418e-aa99-016d72cd668f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d8fc25e3f054e988a715ec90a59c8d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c7a93f7-da8c-4aec-a2ab-f853885c84e4 fcb5eda1-9949-4ddf-981f-5bf7d419bddd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62ca4ea7-3ca5-45be-aa51-7c1f07dca09c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=8320c871-95fc-4422-a1de-f44442d03880) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.400 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 8320c871-95fc-4422-a1de-f44442d03880 in datapath aca06a26-1ace-452c-b833-7d0f7b878fe7 unbound from our chassis#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.401 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aca06a26-1ace-452c-b833-7d0f7b878fe7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.402 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfc6ec0-0120-4c9c-af89-9448f05d11e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.403 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7 namespace which is not needed anymore#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.404 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 26 13:35:25 np0005596062 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Consumed 15.414s CPU time.
Jan 26 13:35:25 np0005596062 systemd-machined[195380]: Machine qemu-18-instance-00000018 terminated.
Jan 26 13:35:25 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [NOTICE]   (255759) : haproxy version is 2.8.14-c23fe91
Jan 26 13:35:25 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [NOTICE]   (255759) : path to executable is /usr/sbin/haproxy
Jan 26 13:35:25 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [WARNING]  (255759) : Exiting Master process...
Jan 26 13:35:25 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [ALERT]    (255759) : Current worker (255761) exited with code 143 (Terminated)
Jan 26 13:35:25 np0005596062 neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7[255755]: [WARNING]  (255759) : All workers exited. Exiting... (0)
Jan 26 13:35:25 np0005596062 systemd[1]: libpod-b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847.scope: Deactivated successfully.
Jan 26 13:35:25 np0005596062 podman[256377]: 2026-01-26 18:35:25.539712256 +0000 UTC m=+0.052652776 container died b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 26 13:35:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:25.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:25 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847-userdata-shm.mount: Deactivated successfully.
Jan 26 13:35:25 np0005596062 systemd[1]: var-lib-containers-storage-overlay-e90b7756508072f255cd0b0b1fa8ce46b713bd323648c863865fe0de151f7d85-merged.mount: Deactivated successfully.
Jan 26 13:35:25 np0005596062 podman[256377]: 2026-01-26 18:35:25.579370714 +0000 UTC m=+0.092311224 container cleanup b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 13:35:25 np0005596062 systemd[1]: libpod-conmon-b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847.scope: Deactivated successfully.
Jan 26 13:35:25 np0005596062 kernel: tap8320c871-95: entered promiscuous mode
Jan 26 13:35:25 np0005596062 kernel: tap8320c871-95 (unregistering): left promiscuous mode
Jan 26 13:35:25 np0005596062 NetworkManager[48993]: <info>  [1769452525.6410] manager: (tap8320c871-95): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00199|binding|INFO|Claiming lport 8320c871-95fc-4422-a1de-f44442d03880 for this chassis.
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00200|binding|INFO|8320c871-95fc-4422-a1de-f44442d03880: Claiming fa:16:3e:c6:40:c3 10.100.0.9
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.645 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00201|binding|INFO|Setting lport 8320c871-95fc-4422-a1de-f44442d03880 ovn-installed in OVS
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00202|if_status|INFO|Dropped 2 log messages in last 76 seconds (most recently, 76 seconds ago) due to excessive rate
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00203|if_status|INFO|Not setting lport 8320c871-95fc-4422-a1de-f44442d03880 down as sb is readonly
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.661 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.665 227317 INFO nova.virt.libvirt.driver [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Instance destroyed successfully.#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.665 227317 DEBUG nova.objects.instance [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lazy-loading 'resources' on Instance uuid f80d1b91-bc87-418e-aa99-016d72cd668f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:35:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:35:25Z|00204|binding|INFO|Releasing lport 8320c871-95fc-4422-a1de-f44442d03880 from this chassis (sb_readonly=0)
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.667 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:40:c3 10.100.0.9'], port_security=['fa:16:3e:c6:40:c3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f80d1b91-bc87-418e-aa99-016d72cd668f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d8fc25e3f054e988a715ec90a59c8d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c7a93f7-da8c-4aec-a2ab-f853885c84e4 fcb5eda1-9949-4ddf-981f-5bf7d419bddd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62ca4ea7-3ca5-45be-aa51-7c1f07dca09c, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=8320c871-95fc-4422-a1de-f44442d03880) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:35:25 np0005596062 podman[256406]: 2026-01-26 18:35:25.678055216 +0000 UTC m=+0.079634685 container remove b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.678 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.679 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:40:c3 10.100.0.9'], port_security=['fa:16:3e:c6:40:c3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f80d1b91-bc87-418e-aa99-016d72cd668f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d8fc25e3f054e988a715ec90a59c8d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c7a93f7-da8c-4aec-a2ab-f853885c84e4 fcb5eda1-9949-4ddf-981f-5bf7d419bddd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62ca4ea7-3ca5-45be-aa51-7c1f07dca09c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=8320c871-95fc-4422-a1de-f44442d03880) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.682 227317 DEBUG nova.virt.libvirt.vif [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:34:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-62938558-access_point-795448959',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-62938558-acce',id=24,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLJiWXodl7Hxj/wweYnSnHc0FwF/SxnekNPyxp62l3q40shVsfGDjmtDdVKQ0kP9OsoTDawgSo4Uj+q+UwvV4zDAKvLswhlqAYblh2u6w9Nv3LR2wklf01sjr7GPx4j6gg==',key_name='tempest-TestSecurityGroupsBasicOps-1930721353',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:34:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8d8fc25e3f054e988a715ec90a59c8d0',ramdisk_id='',reservation_id='r-6xlv7w50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-62938558',owner_user_name='tempest-TestSecurityGroupsBasicOps-62938558-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:34:46Z,user_data=None,user_id='95ada1688fc843cb979bd6c75b517e4a',uuid=f80d1b91-bc87-418e-aa99-016d72cd668f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.683 227317 DEBUG nova.network.os_vif_util [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Converting VIF {"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.683 227317 DEBUG nova.network.os_vif_util [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.683 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb5bb0c-2ade-43a3-bd4c-6930c592e20f]: (4, ('Mon Jan 26 06:35:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7 (b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847)\nb42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847\nMon Jan 26 06:35:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7 (b42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847)\nb42bf8e6a7a28d08a858ffdbb926888cefc66b25bd1984552983913f8fce8847\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.684 227317 DEBUG os_vif [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.685 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5108ac16-6f72-4f66-9570-dd5bb37588b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.686 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.686 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaca06a26-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.686 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8320c871-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.687 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 kernel: tapaca06a26-10: left promiscuous mode
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.689 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.700 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:25 np0005596062 nova_compute[227313]: 2026-01-26 18:35:25.702 227317 INFO os_vif [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:40:c3,bridge_name='br-int',has_traffic_filtering=True,id=8320c871-95fc-4422-a1de-f44442d03880,network=Network(aca06a26-1ace-452c-b833-7d0f7b878fe7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8320c871-95')#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.702 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5197fc-23d6-4c68-88cb-4683d0221085]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.714 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[598b931d-df95-43ff-8014-033eea8d9a77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.715 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[59df7d2a-aec2-424d-b6c4-4fad5a5b655a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.730 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[6a9642e0-eaeb-4abb-ba3e-d1a8660d7d24]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624485, 'reachable_time': 33987, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256441, 'error': None, 'target': 'ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 systemd[1]: run-netns-ovnmeta\x2daca06a26\x2d1ace\x2d452c\x2db833\x2d7d0f7b878fe7.mount: Deactivated successfully.
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.736 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aca06a26-1ace-452c-b833-7d0f7b878fe7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.736 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[a3248cdc-0052-4dc8-b64e-8759497a8ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.737 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 8320c871-95fc-4422-a1de-f44442d03880 in datapath aca06a26-1ace-452c-b833-7d0f7b878fe7 unbound from our chassis#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.738 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aca06a26-1ace-452c-b833-7d0f7b878fe7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.739 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[aa709e0d-f12a-427d-89c0-5e80624885da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.740 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 8320c871-95fc-4422-a1de-f44442d03880 in datapath aca06a26-1ace-452c-b833-7d0f7b878fe7 unbound from our chassis#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.742 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aca06a26-1ace-452c-b833-7d0f7b878fe7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:35:25 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:35:25.742 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[79cbdc31-f37d-43f3-aeb3-4d0483eb7948]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:35:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:26 np0005596062 nova_compute[227313]: 2026-01-26 18:35:26.993 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:26 np0005596062 nova_compute[227313]: 2026-01-26 18:35:26.993 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:26 np0005596062 nova_compute[227313]: 2026-01-26 18:35:26.994 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:35:26 np0005596062 nova_compute[227313]: 2026-01-26 18:35:26.994 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:35:27 np0005596062 nova_compute[227313]: 2026-01-26 18:35:27.164 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:35:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:27.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:35:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:27.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:28 np0005596062 nova_compute[227313]: 2026-01-26 18:35:28.279 227317 INFO nova.virt.libvirt.driver [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Deleting instance files /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f_del#033[00m
Jan 26 13:35:28 np0005596062 nova_compute[227313]: 2026-01-26 18:35:28.280 227317 INFO nova.virt.libvirt.driver [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Deletion of /var/lib/nova/instances/f80d1b91-bc87-418e-aa99-016d72cd668f_del complete#033[00m
Jan 26 13:35:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:29.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.339 227317 DEBUG nova.compute.manager [req-c5cd6493-709e-4ce8-9604-fca0752219ad req-57b501ee-3d4a-42df-8cf9-4692fb92713f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-vif-unplugged-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.339 227317 DEBUG oslo_concurrency.lockutils [req-c5cd6493-709e-4ce8-9604-fca0752219ad req-57b501ee-3d4a-42df-8cf9-4692fb92713f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.340 227317 DEBUG oslo_concurrency.lockutils [req-c5cd6493-709e-4ce8-9604-fca0752219ad req-57b501ee-3d4a-42df-8cf9-4692fb92713f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.340 227317 DEBUG oslo_concurrency.lockutils [req-c5cd6493-709e-4ce8-9604-fca0752219ad req-57b501ee-3d4a-42df-8cf9-4692fb92713f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.340 227317 DEBUG nova.compute.manager [req-c5cd6493-709e-4ce8-9604-fca0752219ad req-57b501ee-3d4a-42df-8cf9-4692fb92713f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] No waiting events found dispatching network-vif-unplugged-8320c871-95fc-4422-a1de-f44442d03880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.340 227317 DEBUG nova.compute.manager [req-c5cd6493-709e-4ce8-9604-fca0752219ad req-57b501ee-3d4a-42df-8cf9-4692fb92713f 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-vif-unplugged-8320c871-95fc-4422-a1de-f44442d03880 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.425 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.425 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.425 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.426 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.426 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.426 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.426 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.427 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.427 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.538 227317 INFO nova.compute.manager [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Took 4.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.539 227317 DEBUG oslo.service.loopingcall [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.540 227317 DEBUG nova.compute.manager [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:35:29 np0005596062 nova_compute[227313]: 2026-01-26 18:35:29.540 227317 DEBUG nova.network.neutron [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:35:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:29.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:30 np0005596062 nova_compute[227313]: 2026-01-26 18:35:30.688 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:35:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:31.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:31.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:31 np0005596062 nova_compute[227313]: 2026-01-26 18:35:31.814 227317 DEBUG nova.compute.manager [req-c3d2504b-d556-4b90-b1f9-d15a8bc17bc3 req-b83b519a-68d6-4cad-b947-21e2277c71f8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:35:31 np0005596062 nova_compute[227313]: 2026-01-26 18:35:31.815 227317 DEBUG oslo_concurrency.lockutils [req-c3d2504b-d556-4b90-b1f9-d15a8bc17bc3 req-b83b519a-68d6-4cad-b947-21e2277c71f8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:35:31 np0005596062 nova_compute[227313]: 2026-01-26 18:35:31.816 227317 DEBUG oslo_concurrency.lockutils [req-c3d2504b-d556-4b90-b1f9-d15a8bc17bc3 req-b83b519a-68d6-4cad-b947-21e2277c71f8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:35:31 np0005596062 nova_compute[227313]: 2026-01-26 18:35:31.816 227317 DEBUG oslo_concurrency.lockutils [req-c3d2504b-d556-4b90-b1f9-d15a8bc17bc3 req-b83b519a-68d6-4cad-b947-21e2277c71f8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:35:31 np0005596062 nova_compute[227313]: 2026-01-26 18:35:31.816 227317 DEBUG nova.compute.manager [req-c3d2504b-d556-4b90-b1f9-d15a8bc17bc3 req-b83b519a-68d6-4cad-b947-21e2277c71f8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] No waiting events found dispatching network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 13:35:31 np0005596062 nova_compute[227313]: 2026-01-26 18:35:31.816 227317 WARNING nova.compute.manager [req-c3d2504b-d556-4b90-b1f9-d15a8bc17bc3 req-b83b519a-68d6-4cad-b947-21e2277c71f8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received unexpected event network-vif-plugged-8320c871-95fc-4422-a1de-f44442d03880 for instance with vm_state active and task_state deleting.
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.140 227317 DEBUG nova.network.neutron [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updated VIF entry in instance network info cache for port 8320c871-95fc-4422-a1de-f44442d03880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.141 227317 DEBUG nova.network.neutron [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updating instance_info_cache with network_info: [{"id": "8320c871-95fc-4422-a1de-f44442d03880", "address": "fa:16:3e:c6:40:c3", "network": {"id": "aca06a26-1ace-452c-b833-7d0f7b878fe7", "bridge": "br-int", "label": "tempest-network-smoke--1123000505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8d8fc25e3f054e988a715ec90a59c8d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8320c871-95", "ovs_interfaceid": "8320c871-95fc-4422-a1de-f44442d03880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.166 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.292 227317 DEBUG oslo_concurrency.lockutils [req-5e3fb492-7971-45ff-85a0-497d2f51c9ef req-7dfe2a6f-6f00-4771-a428-6560f16b7775 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-f80d1b91-bc87-418e-aa99-016d72cd668f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.392 227317 DEBUG nova.network.neutron [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.401 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.513 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.530 227317 INFO nova.compute.manager [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Took 2.99 seconds to deallocate network for instance.
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.663 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.664 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.729 227317 DEBUG oslo_concurrency.processutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:35:32 np0005596062 nova_compute[227313]: 2026-01-26 18:35:32.893 227317 DEBUG nova.compute.manager [req-36e92e17-f9f6-45a9-b570-731ccf9f67b3 req-53b6f418-aed3-4304-96d0-3d32fdfbd6f3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Received event network-vif-deleted-8320c871-95fc-4422-a1de-f44442d03880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:35:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:35:33 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3642783861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:35:33 np0005596062 nova_compute[227313]: 2026-01-26 18:35:33.171 227317 DEBUG oslo_concurrency.processutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:35:33 np0005596062 nova_compute[227313]: 2026-01-26 18:35:33.177 227317 DEBUG nova.compute.provider_tree [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:35:33 np0005596062 nova_compute[227313]: 2026-01-26 18:35:33.202 227317 DEBUG nova.scheduler.client.report [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:35:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:33.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:33 np0005596062 nova_compute[227313]: 2026-01-26 18:35:33.245 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:35:33 np0005596062 nova_compute[227313]: 2026-01-26 18:35:33.298 227317 INFO nova.scheduler.client.report [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Deleted allocations for instance f80d1b91-bc87-418e-aa99-016d72cd668f
Jan 26 13:35:33 np0005596062 nova_compute[227313]: 2026-01-26 18:35:33.428 227317 DEBUG oslo_concurrency.lockutils [None req-80319f31-2555-4569-8289-fe3345196649 95ada1688fc843cb979bd6c75b517e4a 8d8fc25e3f054e988a715ec90a59c8d0 - - default default] Lock "f80d1b91-bc87-418e-aa99-016d72cd668f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:35:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:33.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:35.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:35.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:35 np0005596062 nova_compute[227313]: 2026-01-26 18:35:35.690 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:37 np0005596062 nova_compute[227313]: 2026-01-26 18:35:37.168 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:37.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:37.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:39.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:39.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:40 np0005596062 nova_compute[227313]: 2026-01-26 18:35:40.661 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452525.6604059, f80d1b91-bc87-418e-aa99-016d72cd668f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:35:40 np0005596062 nova_compute[227313]: 2026-01-26 18:35:40.662 227317 INFO nova.compute.manager [-] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] VM Stopped (Lifecycle Event)
Jan 26 13:35:40 np0005596062 nova_compute[227313]: 2026-01-26 18:35:40.687 227317 DEBUG nova.compute.manager [None req-2ee9f69a-6921-4d37-803d-1cad47d423de - - - - - -] [instance: f80d1b91-bc87-418e-aa99-016d72cd668f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:35:40 np0005596062 nova_compute[227313]: 2026-01-26 18:35:40.691 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:41.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:41.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:42 np0005596062 nova_compute[227313]: 2026-01-26 18:35:42.171 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:42 np0005596062 podman[256528]: 2026-01-26 18:35:42.840731991 +0000 UTC m=+0.051846834 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:35:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:43.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:43.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:45.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:45.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:45 np0005596062 nova_compute[227313]: 2026-01-26 18:35:45.692 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:47 np0005596062 nova_compute[227313]: 2026-01-26 18:35:47.173 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:47.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:47.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:49.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:49.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:50 np0005596062 nova_compute[227313]: 2026-01-26 18:35:50.740 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:35:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:35:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:51.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:52 np0005596062 nova_compute[227313]: 2026-01-26 18:35:52.176 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:53.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:53.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:53 np0005596062 podman[256555]: 2026-01-26 18:35:53.859273946 +0000 UTC m=+0.074148008 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 26 13:35:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:55.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:55.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:55 np0005596062 nova_compute[227313]: 2026-01-26 18:35:55.742 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:35:57 np0005596062 nova_compute[227313]: 2026-01-26 18:35:57.177 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:35:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:57.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:35:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:57.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:35:57 np0005596062 nova_compute[227313]: 2026-01-26 18:35:57.976 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:35:57 np0005596062 nova_compute[227313]: 2026-01-26 18:35:57.976 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:35:57 np0005596062 nova_compute[227313]: 2026-01-26 18:35:57.994 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.095 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.096 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.121 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.121 227317 INFO nova.compute.claims [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Claim successful on node compute-2.ctlplane.example.com
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.272 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:35:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:35:58 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1216368179' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.730 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.735 227317 DEBUG nova.compute.provider_tree [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.768 227317 DEBUG nova.scheduler.client.report [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.814 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.815 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.870 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.870 227317 DEBUG nova.network.neutron [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.891 227317 INFO nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:35:58 np0005596062 nova_compute[227313]: 2026-01-26 18:35:58.907 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.021 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.022 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.023 227317 INFO nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Creating image(s)#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.054 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.086 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.118 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.123 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.199 227317 DEBUG nova.policy [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffa1cd7ba9e543f78f2ef48c2a7a67a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '301bad5c2066428fa7f214024672bf92', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.202 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.204 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.204 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.204 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.230 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:35:59 np0005596062 nova_compute[227313]: 2026-01-26 18:35:59.234 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 46f89010-5c5d-4c32-ba88-951b6d640927_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:35:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:35:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:35:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:35:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:35:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:35:59.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:00 np0005596062 nova_compute[227313]: 2026-01-26 18:36:00.743 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:01.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:01.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:02 np0005596062 nova_compute[227313]: 2026-01-26 18:36:02.160 227317 DEBUG nova.network.neutron [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Successfully created port: ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:36:02 np0005596062 nova_compute[227313]: 2026-01-26 18:36:02.178 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:03.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:03.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:03 np0005596062 ceph-mds[83671]: mds.beacon.cephfs.compute-2.oqvedy missed beacon ack from the monitors
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.044 227317 DEBUG nova.network.neutron [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Successfully updated port: ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.099 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.099 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquired lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.099 227317 DEBUG nova.network.neutron [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.250 227317 DEBUG nova.compute.manager [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-changed-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.251 227317 DEBUG nova.compute.manager [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Refreshing instance network info cache due to event network-changed-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.251 227317 DEBUG oslo_concurrency.lockutils [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:36:04 np0005596062 nova_compute[227313]: 2026-01-26 18:36:04.389 227317 DEBUG nova.network.neutron [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:36:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:05.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:05.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:05 np0005596062 nova_compute[227313]: 2026-01-26 18:36:05.653 227317 DEBUG nova.network.neutron [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:36:05 np0005596062 nova_compute[227313]: 2026-01-26 18:36:05.744 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:05 np0005596062 nova_compute[227313]: 2026-01-26 18:36:05.818 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Releasing lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:36:05 np0005596062 nova_compute[227313]: 2026-01-26 18:36:05.818 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance network_info: |[{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:36:05 np0005596062 nova_compute[227313]: 2026-01-26 18:36:05.818 227317 DEBUG oslo_concurrency.lockutils [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:36:05 np0005596062 nova_compute[227313]: 2026-01-26 18:36:05.819 227317 DEBUG nova.network.neutron [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Refreshing network info cache for port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:36:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).paxos(paxos updating c 3515..4251) lease_timeout -- calling new election
Jan 26 13:36:06 np0005596062 ceph-mon[77178]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 26 13:36:06 np0005596062 ceph-mon[77178]: paxos.1).electionLogic(18) init, last seen epoch 18
Jan 26 13:36:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:36:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:36:07 np0005596062 nova_compute[227313]: 2026-01-26 18:36:07.180 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:07.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:07.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:07 np0005596062 ceph-mds[83671]: mds.beacon.cephfs.compute-2.oqvedy missed beacon ack from the monitors
Jan 26 13:36:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:36:09.185 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:36:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:36:09.186 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:36:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:36:09.186 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:36:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:09.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:36:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:09.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:36:09 np0005596062 nova_compute[227313]: 2026-01-26 18:36:09.696 227317 DEBUG nova.network.neutron [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updated VIF entry in instance network info cache for port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:36:09 np0005596062 nova_compute[227313]: 2026-01-26 18:36:09.697 227317 DEBUG nova.network.neutron [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:36:09 np0005596062 nova_compute[227313]: 2026-01-26 18:36:09.724 227317 DEBUG oslo_concurrency.lockutils [req-97b9d8ff-5f87-478a-a6d8-340a84497184 req-be8aa228-b020-4d08-b288-a631fc15ab35 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:36:10 np0005596062 nova_compute[227313]: 2026-01-26 18:36:10.746 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:11.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:36:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:11.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:12 np0005596062 nova_compute[227313]: 2026-01-26 18:36:12.182 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:12 np0005596062 ceph-mon[77178]: mon.compute-2 calling monitor election
Jan 26 13:36:12 np0005596062 ceph-mon[77178]: mon.compute-0 calling monitor election
Jan 26 13:36:12 np0005596062 ceph-mon[77178]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 26 13:36:12 np0005596062 ceph-mon[77178]: Health check failed: 1/3 mons down, quorum compute-0,compute-2 (MON_DOWN)
Jan 26 13:36:12 np0005596062 ceph-mon[77178]: Health detail: HEALTH_WARN 1/3 mons down, quorum compute-0,compute-2
Jan 26 13:36:12 np0005596062 ceph-mon[77178]: [WRN] MON_DOWN: 1/3 mons down, quorum compute-0,compute-2
Jan 26 13:36:12 np0005596062 ceph-mon[77178]:    mon.compute-1 (rank 2) addr [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] is down (out of quorum)
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.082 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.082 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.082 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:36:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:13.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:36:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2436705316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.510 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:36:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:13.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:13 np0005596062 podman[256777]: 2026-01-26 18:36:13.619606439 +0000 UTC m=+0.057526075 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.685 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.686 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4746MB free_disk=20.988269805908203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.686 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.686 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.855 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 46f89010-5c5d-4c32-ba88-951b6d640927 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.856 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.856 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 13:36:13 np0005596062 nova_compute[227313]: 2026-01-26 18:36:13.899 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:36:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:36:14 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1971403726' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:36:14 np0005596062 nova_compute[227313]: 2026-01-26 18:36:14.312 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:36:14 np0005596062 nova_compute[227313]: 2026-01-26 18:36:14.320 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:36:14 np0005596062 nova_compute[227313]: 2026-01-26 18:36:14.346 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:36:14 np0005596062 nova_compute[227313]: 2026-01-26 18:36:14.376 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 13:36:14 np0005596062 nova_compute[227313]: 2026-01-26 18:36:14.376 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:36:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:15.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:15.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:15 np0005596062 nova_compute[227313]: 2026-01-26 18:36:15.747 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:36:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 13:36:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:36:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:36:17 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:36:17 np0005596062 nova_compute[227313]: 2026-01-26 18:36:17.184 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:36:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:17.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:17.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:19.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:19.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:36:20.143 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 13:36:20 np0005596062 nova_compute[227313]: 2026-01-26 18:36:20.144 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:36:20 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:36:20.145 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 13:36:20 np0005596062 nova_compute[227313]: 2026-01-26 18:36:20.373 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:36:20 np0005596062 ovn_controller[133984]: 2026-01-26T18:36:20Z|00205|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.571847) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452580571933, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 2436, "num_deletes": 255, "total_data_size": 5804875, "memory_usage": 5880160, "flush_reason": "Manual Compaction"}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452580597849, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 3773095, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41761, "largest_seqno": 44192, "table_properties": {"data_size": 3763256, "index_size": 6205, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 21038, "raw_average_key_size": 20, "raw_value_size": 3743310, "raw_average_value_size": 3706, "num_data_blocks": 270, "num_entries": 1010, "num_filter_entries": 1010, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452363, "oldest_key_time": 1769452363, "file_creation_time": 1769452580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 26146 microseconds, and 9999 cpu microseconds.
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.597999) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 3773095 bytes OK
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.598056) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.599880) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.599897) EVENT_LOG_v1 {"time_micros": 1769452580599891, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.599915) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 5794204, prev total WAL file size 5794204, number of live WAL files 2.
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.601350) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(3684KB)], [81(8664KB)]
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452580601397, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 12645485, "oldest_snapshot_seqno": -1}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6714 keys, 10681489 bytes, temperature: kUnknown
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452580666817, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 10681489, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10636830, "index_size": 26755, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16837, "raw_key_size": 172082, "raw_average_key_size": 25, "raw_value_size": 10516477, "raw_average_value_size": 1566, "num_data_blocks": 1071, "num_entries": 6714, "num_filter_entries": 6714, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.667152) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 10681489 bytes
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.668684) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.0 rd, 163.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 8.5 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 7251, records dropped: 537 output_compression: NoCompression
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.668735) EVENT_LOG_v1 {"time_micros": 1769452580668721, "job": 50, "event": "compaction_finished", "compaction_time_micros": 65527, "compaction_time_cpu_micros": 23913, "output_level": 6, "num_output_files": 1, "total_output_size": 10681489, "num_input_records": 7251, "num_output_records": 6714, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452580669921, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452580672725, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.601237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.672805) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.672811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.672817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.672821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:20 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:20.672824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:20 np0005596062 nova_compute[227313]: 2026-01-26 18:36:20.784 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:36:21 np0005596062 nova_compute[227313]: 2026-01-26 18:36:21.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:36:21 np0005596062 nova_compute[227313]: 2026-01-26 18:36:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:36:21 np0005596062 nova_compute[227313]: 2026-01-26 18:36:21.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:36:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:21.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:21.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:22 np0005596062 nova_compute[227313]: 2026-01-26 18:36:22.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:36:22 np0005596062 nova_compute[227313]: 2026-01-26 18:36:22.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:36:22 np0005596062 nova_compute[227313]: 2026-01-26 18:36:22.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:36:22 np0005596062 nova_compute[227313]: 2026-01-26 18:36:22.186 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:22 np0005596062 nova_compute[227313]: 2026-01-26 18:36:22.367 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 13:36:22 np0005596062 nova_compute[227313]: 2026-01-26 18:36:22.368 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:36:22 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:36:22 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:36:23 np0005596062 nova_compute[227313]: 2026-01-26 18:36:23.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:36:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:23.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:23.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:24 np0005596062 nova_compute[227313]: 2026-01-26 18:36:24.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:36:24 np0005596062 podman[257056]: 2026-01-26 18:36:24.867419102 +0000 UTC m=+0.080596531 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 13:36:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:25.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:25.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:25 np0005596062 nova_compute[227313]: 2026-01-26 18:36:25.786 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:26 np0005596062 nova_compute[227313]: 2026-01-26 18:36:26.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:36:26 np0005596062 nova_compute[227313]: 2026-01-26 18:36:26.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:36:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:27 np0005596062 nova_compute[227313]: 2026-01-26 18:36:27.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:36:27 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:36:27.147 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:36:27 np0005596062 nova_compute[227313]: 2026-01-26 18:36:27.188 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:27.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:27.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:29.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:29.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:30 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:30 np0005596062 nova_compute[227313]: 2026-01-26 18:36:30.826 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:31 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:31.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:36:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:31.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:36:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:32 np0005596062 nova_compute[227313]: 2026-01-26 18:36:32.189 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:32 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:33.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:33 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:33.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:34 np0005596062 ceph-mon[77178]: Health check failed: 1 slow ops, oldest one blocked for 31 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:36:34 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:35.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:35 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:35.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:35 np0005596062 nova_compute[227313]: 2026-01-26 18:36:35.827 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:36 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:37 np0005596062 nova_compute[227313]: 2026-01-26 18:36:37.191 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:37.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:37 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:37.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:38 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:39.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:39 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:39.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:36:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1536386802' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:36:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:36:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1536386802' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:36:40 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:40 np0005596062 nova_compute[227313]: 2026-01-26 18:36:40.829 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:36:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:41.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.564763) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452601564807, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 503, "num_deletes": 258, "total_data_size": 603396, "memory_usage": 614200, "flush_reason": "Manual Compaction"}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452601656136, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 397404, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44197, "largest_seqno": 44695, "table_properties": {"data_size": 394756, "index_size": 684, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6583, "raw_average_key_size": 18, "raw_value_size": 389257, "raw_average_value_size": 1099, "num_data_blocks": 29, "num_entries": 354, "num_filter_entries": 354, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452581, "oldest_key_time": 1769452581, "file_creation_time": 1769452601, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 91422 microseconds, and 2387 cpu microseconds.
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:36:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:41.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.656184) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 397404 bytes OK
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.656206) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.689280) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.689312) EVENT_LOG_v1 {"time_micros": 1769452601689305, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.689331) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 600349, prev total WAL file size 893998, number of live WAL files 2.
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.689751) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323535' seq:72057594037927935, type:22 .. '6C6F676D0031353039' seq:0, type:0; will stop at (end)
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(388KB)], [84(10MB)]
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452601689772, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11078893, "oldest_snapshot_seqno": -1}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6541 keys, 10960159 bytes, temperature: kUnknown
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452601829319, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 10960159, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10915845, "index_size": 26858, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 169493, "raw_average_key_size": 25, "raw_value_size": 10797610, "raw_average_value_size": 1650, "num_data_blocks": 1073, "num_entries": 6541, "num_filter_entries": 6541, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452601, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.830552) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 10960159 bytes
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.832509) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 79.4 rd, 78.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.2 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(55.5) write-amplify(27.6) OK, records in: 7068, records dropped: 527 output_compression: NoCompression
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.832590) EVENT_LOG_v1 {"time_micros": 1769452601832559, "job": 52, "event": "compaction_finished", "compaction_time_micros": 139604, "compaction_time_cpu_micros": 23675, "output_level": 6, "num_output_files": 1, "total_output_size": 10960159, "num_input_records": 7068, "num_output_records": 6541, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452601833184, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: Health check update: 1 slow ops, oldest one blocked for 37 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452601836919, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.689712) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.836983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.836988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.836990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.836992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:36:41.836993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:36:42 np0005596062 nova_compute[227313]: 2026-01-26 18:36:42.194 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:42 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:42 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:43.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:43.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:43 np0005596062 podman[257142]: 2026-01-26 18:36:43.845577781 +0000 UTC m=+0.059703454 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:36:43 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:44 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:45.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:45.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:45 np0005596062 nova_compute[227313]: 2026-01-26 18:36:45.847 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:45 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:46 np0005596062 ceph-mon[77178]: Health check update: 1 slow ops, oldest one blocked for 41 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:36:46 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:47 np0005596062 nova_compute[227313]: 2026-01-26 18:36:47.195 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:47.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:47.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:48 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:49.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:49.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:49 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:50 np0005596062 nova_compute[227313]: 2026-01-26 18:36:50.848 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:50 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:51.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:51.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:51 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:51 np0005596062 ceph-mon[77178]: Health check update: 1 slow ops, oldest one blocked for 46 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:36:52 np0005596062 nova_compute[227313]: 2026-01-26 18:36:52.197 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:52 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:53.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:53.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:53 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:54 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:55.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:55.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:55 np0005596062 nova_compute[227313]: 2026-01-26 18:36:55.850 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:55 np0005596062 podman[257167]: 2026-01-26 18:36:55.888377898 +0000 UTC m=+0.095059256 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 13:36:56 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:36:57 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:57 np0005596062 ceph-mon[77178]: Health check update: 1 slow ops, oldest one blocked for 51 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:36:57 np0005596062 nova_compute[227313]: 2026-01-26 18:36:57.199 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:36:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:57.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:57.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:36:58 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:59 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:36:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:36:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:36:59.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:36:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:36:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:36:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:36:59.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:00 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:37:00 np0005596062 nova_compute[227313]: 2026-01-26 18:37:00.880 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:01 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:37:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:01.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:01.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:02 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:37:02 np0005596062 ceph-mon[77178]: Health check update: 1 slow ops, oldest one blocked for 56 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.201 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.745 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 46f89010-5c5d-4c32-ba88-951b6d640927_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 63.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.813 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] resizing rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.921 227317 DEBUG nova.objects.instance [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'migration_context' on Instance uuid 46f89010-5c5d-4c32-ba88-951b6d640927 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.934 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.935 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Ensure instance console log exists: /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.935 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.936 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.936 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.938 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Start _get_guest_xml network_info=[{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.942 227317 WARNING nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.948 227317 DEBUG nova.virt.libvirt.host [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.948 227317 DEBUG nova.virt.libvirt.host [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.951 227317 DEBUG nova.virt.libvirt.host [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.951 227317 DEBUG nova.virt.libvirt.host [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.952 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.952 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.952 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.953 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.953 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.953 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.953 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.953 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.954 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.954 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.954 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.954 227317 DEBUG nova.virt.hardware [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:37:02 np0005596062 nova_compute[227313]: 2026-01-26 18:37:02.957 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: 1 slow requests (by type [ 'started' : 1 ] most affected pool [ 'vms' : 1 ])
Jan 26 13:37:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:37:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:03.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1986890856' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.404 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.433 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.438 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:37:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:03.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1393258661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:37:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1271592668' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.848 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.850 227317 DEBUG nova.virt.libvirt.vif [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:35:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096981336',display_name='tempest-TestNetworkAdvancedServerOps-server-2096981336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096981336',id=25,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQJz/YobTJceZvzJVuWmJsMcwBHeXUK3qg9BboX4DIZ5bVn3L/CtROKlHp/+NBsJy5WBPfnAbNkl+SqE4ICwsBMnEMwqWyuIpclQAnUTz3DIA/5r+AyFQgQJNuNY5sFTA==',key_name='tempest-TestNetworkAdvancedServerOps-185858677',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-7t4115p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:35:58Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=46f89010-5c5d-4c32-ba88-951b6d640927,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.850 227317 DEBUG nova.network.os_vif_util [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.851 227317 DEBUG nova.network.os_vif_util [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.852 227317 DEBUG nova.objects.instance [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'pci_devices' on Instance uuid 46f89010-5c5d-4c32-ba88-951b6d640927 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.879 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <uuid>46f89010-5c5d-4c32-ba88-951b6d640927</uuid>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <name>instance-00000019</name>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2096981336</nova:name>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:37:02</nova:creationTime>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:user uuid="ffa1cd7ba9e543f78f2ef48c2a7a67a2">tempest-TestNetworkAdvancedServerOps-1357272614-project-member</nova:user>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:project uuid="301bad5c2066428fa7f214024672bf92">tempest-TestNetworkAdvancedServerOps-1357272614</nova:project>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <nova:port uuid="ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <entry name="serial">46f89010-5c5d-4c32-ba88-951b6d640927</entry>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <entry name="uuid">46f89010-5c5d-4c32-ba88-951b6d640927</entry>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/46f89010-5c5d-4c32-ba88-951b6d640927_disk">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/46f89010-5c5d-4c32-ba88-951b6d640927_disk.config">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:d8:b9:47"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <target dev="tapec4b7772-c4"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/console.log" append="off"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:37:03 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:37:03 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:37:03 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:37:03 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.881 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Preparing to wait for external event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.882 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.883 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.883 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.885 227317 DEBUG nova.virt.libvirt.vif [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:35:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096981336',display_name='tempest-TestNetworkAdvancedServerOps-server-2096981336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096981336',id=25,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQJz/YobTJceZvzJVuWmJsMcwBHeXUK3qg9BboX4DIZ5bVn3L/CtROKlHp/+NBsJy5WBPfnAbNkl+SqE4ICwsBMnEMwqWyuIpclQAnUTz3DIA/5r+AyFQgQJNuNY5sFTA==',key_name='tempest-TestNetworkAdvancedServerOps-185858677',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-7t4115p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:35:58Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=46f89010-5c5d-4c32-ba88-951b6d640927,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.886 227317 DEBUG nova.network.os_vif_util [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.887 227317 DEBUG nova.network.os_vif_util [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.888 227317 DEBUG os_vif [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.890 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.891 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.892 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.896 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.897 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec4b7772-c4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.898 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec4b7772-c4, col_values=(('external_ids', {'iface-id': 'ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:b9:47', 'vm-uuid': '46f89010-5c5d-4c32-ba88-951b6d640927'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.900 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:03 np0005596062 NetworkManager[48993]: <info>  [1769452623.9013] manager: (tapec4b7772-c4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.902 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.906 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.906 227317 INFO os_vif [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4')#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.947 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.948 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.948 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No VIF found with MAC fa:16:3e:d8:b9:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.948 227317 INFO nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Using config drive#033[00m
Jan 26 13:37:03 np0005596062 nova_compute[227313]: 2026-01-26 18:37:03.972 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:37:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:05.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:05.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:05 np0005596062 nova_compute[227313]: 2026-01-26 18:37:05.736 227317 INFO nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Creating config drive at /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/disk.config#033[00m
Jan 26 13:37:05 np0005596062 nova_compute[227313]: 2026-01-26 18:37:05.740 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf75lg1xn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:37:05 np0005596062 nova_compute[227313]: 2026-01-26 18:37:05.884 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf75lg1xn" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:37:05 np0005596062 nova_compute[227313]: 2026-01-26 18:37:05.914 227317 DEBUG nova.storage.rbd_utils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image 46f89010-5c5d-4c32-ba88-951b6d640927_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:37:05 np0005596062 nova_compute[227313]: 2026-01-26 18:37:05.918 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/disk.config 46f89010-5c5d-4c32-ba88-951b6d640927_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.066 227317 DEBUG oslo_concurrency.processutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/disk.config 46f89010-5c5d-4c32-ba88-951b6d640927_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.067 227317 INFO nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Deleting local config drive /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927/disk.config because it was imported into RBD.#033[00m
Jan 26 13:37:06 np0005596062 kernel: tapec4b7772-c4: entered promiscuous mode
Jan 26 13:37:06 np0005596062 NetworkManager[48993]: <info>  [1769452626.1176] manager: (tapec4b7772-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Jan 26 13:37:06 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:06Z|00206|binding|INFO|Claiming lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for this chassis.
Jan 26 13:37:06 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:06Z|00207|binding|INFO|ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4: Claiming fa:16:3e:d8:b9:47 10.100.0.12
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.118 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.123 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.132 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:b9:47 10.100.0.12'], port_security=['fa:16:3e:d8:b9:47 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46f89010-5c5d-4c32-ba88-951b6d640927', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da815054-d79b-464b-a232-6b8265207d78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f896585a-50c1-4248-9fde-f1b2702fb2aa, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.133 143929 INFO neutron.agent.ovn.metadata.agent [-] Port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 in datapath 25ae0294-d511-4bdd-8a1f-f103179c52b7 bound to our chassis#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.134 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25ae0294-d511-4bdd-8a1f-f103179c52b7#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.144 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e38ebc-46c0-4aee-8609-1b01dbeec0de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.145 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap25ae0294-d1 in ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.146 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap25ae0294-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.146 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b551b16b-5b96-4494-9fad-20e51317a38e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 systemd-machined[195380]: New machine qemu-19-instance-00000019.
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.147 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0706d8be-05a1-4e68-9445-fd89010e2b80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.159 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[92166ff8-bc10-4f88-ac04-1c8bdf84dc74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.183 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5f898a61-1e45-422d-896c-4007ce0f6965]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 systemd-udevd[257462]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.192 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:06Z|00208|binding|INFO|Setting lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 ovn-installed in OVS
Jan 26 13:37:06 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:06Z|00209|binding|INFO|Setting lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 up in Southbound
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.199 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 NetworkManager[48993]: <info>  [1769452626.2062] device (tapec4b7772-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:37:06 np0005596062 NetworkManager[48993]: <info>  [1769452626.2081] device (tapec4b7772-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.213 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[077cc01c-f521-4c40-a3b5-dd68c51562ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.219 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[12450b33-7251-4ee2-aca0-518a89274352]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 NetworkManager[48993]: <info>  [1769452626.2199] manager: (tap25ae0294-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.244 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3cdefb-3981-4707-8516-d40f1dff790e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.247 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[9c48f234-f05e-4cc6-8de3-bd49ef6a359e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 NetworkManager[48993]: <info>  [1769452626.2647] device (tap25ae0294-d0): carrier: link connected
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.269 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[6a626abb-0173-474d-a941-bd4edf32c55b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.285 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2cbd1aa2-7972-4db6-bc71-08b42c4fce40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25ae0294-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:aa:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638631, 'reachable_time': 42768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257493, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.301 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e977638c-8d0a-4c30-bcf1-bb1e7d08c535]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:aa50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638631, 'tstamp': 638631}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257494, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.315 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2db14307-74da-468f-882d-fc4e6ed328ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25ae0294-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:aa:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638631, 'reachable_time': 42768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257495, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.344 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0f972faa-0d8c-4185-a7b4-fc1dcce20ada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.395 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e030ddf8-13bd-4f32-abfd-cf8273d97a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.396 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25ae0294-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.397 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.397 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25ae0294-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.398 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 NetworkManager[48993]: <info>  [1769452626.3995] manager: (tap25ae0294-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 26 13:37:06 np0005596062 kernel: tap25ae0294-d0: entered promiscuous mode
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.401 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.402 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25ae0294-d0, col_values=(('external_ids', {'iface-id': 'e1b2dc53-fbcf-4b93-9880-910123b0b71c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.403 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:06Z|00210|binding|INFO|Releasing lport e1b2dc53-fbcf-4b93-9880-910123b0b71c from this chassis (sb_readonly=0)
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.421 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.422 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/25ae0294-d511-4bdd-8a1f-f103179c52b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/25ae0294-d511-4bdd-8a1f-f103179c52b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.424 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[139d9d00-2b0f-4ca9-99ac-66d60c9f9035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.424 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-25ae0294-d511-4bdd-8a1f-f103179c52b7
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/25ae0294-d511-4bdd-8a1f-f103179c52b7.pid.haproxy
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 25ae0294-d511-4bdd-8a1f-f103179c52b7
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 26 13:37:06 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:06.425 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'env', 'PROCESS_TAG=haproxy-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/25ae0294-d511-4bdd-8a1f-f103179c52b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.620 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452626.6193323, 46f89010-5c5d-4c32-ba88-951b6d640927 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.620 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] VM Started (Lifecycle Event)
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.639 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.644 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452626.6194274, 46f89010-5c5d-4c32-ba88-951b6d640927 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.645 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] VM Paused (Lifecycle Event)
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.662 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.665 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 13:37:06 np0005596062 nova_compute[227313]: 2026-01-26 18:37:06.685 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 13:37:06 np0005596062 podman[257569]: 2026-01-26 18:37:06.783257998 +0000 UTC m=+0.045901306 container create da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:37:06 np0005596062 systemd[1]: Started libpod-conmon-da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94.scope.
Jan 26 13:37:06 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:37:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:06 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a04e4c16ed83638fad151ffd6b3fe1e0b9cd792301c093ec50f3884d90650c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:37:06 np0005596062 podman[257569]: 2026-01-26 18:37:06.75784837 +0000 UTC m=+0.020491648 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:37:06 np0005596062 podman[257569]: 2026-01-26 18:37:06.854781665 +0000 UTC m=+0.117424993 container init da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 13:37:06 np0005596062 podman[257569]: 2026-01-26 18:37:06.859424519 +0000 UTC m=+0.122067817 container start da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 26 13:37:06 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [NOTICE]   (257588) : New worker (257590) forked
Jan 26 13:37:06 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [NOTICE]   (257588) : Loading success.
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.076 227317 DEBUG nova.compute.manager [req-0b317dcb-401c-4571-aca2-0b32ca3407ea req-9331f13d-1444-4567-b8ae-460a1c2c6057 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.077 227317 DEBUG oslo_concurrency.lockutils [req-0b317dcb-401c-4571-aca2-0b32ca3407ea req-9331f13d-1444-4567-b8ae-460a1c2c6057 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.077 227317 DEBUG oslo_concurrency.lockutils [req-0b317dcb-401c-4571-aca2-0b32ca3407ea req-9331f13d-1444-4567-b8ae-460a1c2c6057 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.077 227317 DEBUG oslo_concurrency.lockutils [req-0b317dcb-401c-4571-aca2-0b32ca3407ea req-9331f13d-1444-4567-b8ae-460a1c2c6057 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.077 227317 DEBUG nova.compute.manager [req-0b317dcb-401c-4571-aca2-0b32ca3407ea req-9331f13d-1444-4567-b8ae-460a1c2c6057 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Processing event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.078 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.082 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452627.0826676, 46f89010-5c5d-4c32-ba88-951b6d640927 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.083 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] VM Resumed (Lifecycle Event)
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.084 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.088 227317 INFO nova.virt.libvirt.driver [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance spawned successfully.
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.089 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.106 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.111 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.113 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.113 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.114 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.114 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.114 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.115 227317 DEBUG nova.virt.libvirt.driver [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.145 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.194 227317 INFO nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Took 68.17 seconds to spawn the instance on the hypervisor.
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.195 227317 DEBUG nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 13:37:07 np0005596062 ceph-mon[77178]: Health check update: 1 slow ops, oldest one blocked for 61 sec, osd.0 has slow ops (SLOW_OPS)
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.210 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.252 227317 INFO nova.compute.manager [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Took 69.19 seconds to build instance.
Jan 26 13:37:07 np0005596062 nova_compute[227313]: 2026-01-26 18:37:07.265 227317 DEBUG oslo_concurrency.lockutils [None req-73434682-4707-4a11-a776-89334ed2241c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 69.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:37:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:07.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:07.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:08 np0005596062 ceph-mon[77178]: Health check cleared: SLOW_OPS (was: 1 slow ops, oldest one blocked for 61 sec, osd.0 has slow ops)
Jan 26 13:37:08 np0005596062 ceph-mon[77178]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 26 13:37:08 np0005596062 ceph-mon[77178]: paxos.1).electionLogic(23) init, last seen epoch 23, mid-election, bumping
Jan 26 13:37:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:37:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:37:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 13:37:08 np0005596062 nova_compute[227313]: 2026-01-26 18:37:08.900 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:37:09 np0005596062 nova_compute[227313]: 2026-01-26 18:37:09.161 227317 DEBUG nova.compute.manager [req-b5b17ec4-5ce3-48f3-8fb1-305973a08290 req-716330af-765a-465b-8113-d16aa1f27a53 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:37:09 np0005596062 nova_compute[227313]: 2026-01-26 18:37:09.161 227317 DEBUG oslo_concurrency.lockutils [req-b5b17ec4-5ce3-48f3-8fb1-305973a08290 req-716330af-765a-465b-8113-d16aa1f27a53 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:37:09 np0005596062 nova_compute[227313]: 2026-01-26 18:37:09.161 227317 DEBUG oslo_concurrency.lockutils [req-b5b17ec4-5ce3-48f3-8fb1-305973a08290 req-716330af-765a-465b-8113-d16aa1f27a53 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:37:09 np0005596062 nova_compute[227313]: 2026-01-26 18:37:09.162 227317 DEBUG oslo_concurrency.lockutils [req-b5b17ec4-5ce3-48f3-8fb1-305973a08290 req-716330af-765a-465b-8113-d16aa1f27a53 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:37:09 np0005596062 nova_compute[227313]: 2026-01-26 18:37:09.162 227317 DEBUG nova.compute.manager [req-b5b17ec4-5ce3-48f3-8fb1-305973a08290 req-716330af-765a-465b-8113-d16aa1f27a53 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 13:37:09 np0005596062 nova_compute[227313]: 2026-01-26 18:37:09.162 227317 WARNING nova.compute.manager [req-b5b17ec4-5ce3-48f3-8fb1-305973a08290 req-716330af-765a-465b-8113-d16aa1f27a53 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received unexpected event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with vm_state active and task_state None.
Jan 26 13:37:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:09.186 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:37:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:09.187 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:37:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:09.187 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:37:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:09.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:09.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:10 np0005596062 ceph-mon[77178]: mon.compute-0 calling monitor election
Jan 26 13:37:10 np0005596062 ceph-mon[77178]: mon.compute-2 calling monitor election
Jan 26 13:37:10 np0005596062 ceph-mon[77178]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 26 13:37:10 np0005596062 ceph-mon[77178]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum compute-0,compute-2)
Jan 26 13:37:10 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 13:37:11 np0005596062 nova_compute[227313]: 2026-01-26 18:37:11.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:37:11 np0005596062 nova_compute[227313]: 2026-01-26 18:37:11.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 26 13:37:11 np0005596062 ceph-mon[77178]: Health check failed: 9 slow ops, oldest one blocked for 62 sec, mon.compute-1 has slow ops (SLOW_OPS)
Jan 26 13:37:11 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:37:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:11.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:37:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:11.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:37:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:12 np0005596062 nova_compute[227313]: 2026-01-26 18:37:12.213 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.073 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.102 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.103 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.103 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.104 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.104 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:37:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:13.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:37:13 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2237313325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.563 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.644 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000019 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.644 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-00000019 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 26 13:37:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:13.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.809 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.810 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4565MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.810 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.811 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.893 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 46f89010-5c5d-4c32-ba88-951b6d640927 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.894 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.894 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.902 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:37:13 np0005596062 nova_compute[227313]: 2026-01-26 18:37:13.922 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:37:14 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:37:14 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3425023570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:37:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:14Z|00211|binding|INFO|Releasing lport e1b2dc53-fbcf-4b93-9880-910123b0b71c from this chassis (sb_readonly=0)
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.406 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:14 np0005596062 NetworkManager[48993]: <info>  [1769452634.4086] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 26 13:37:14 np0005596062 NetworkManager[48993]: <info>  [1769452634.4095] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Jan 26 13:37:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:14Z|00212|binding|INFO|Releasing lport e1b2dc53-fbcf-4b93-9880-910123b0b71c from this chassis (sb_readonly=0)
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.415 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.437 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.446 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.471 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.499 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.499 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.499 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.500 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:37:14 np0005596062 nova_compute[227313]: 2026-01-26 18:37:14.514 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:37:14 np0005596062 podman[257649]: 2026-01-26 18:37:14.881481277 +0000 UTC m=+0.076125661 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 13:37:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:15.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:15 np0005596062 nova_compute[227313]: 2026-01-26 18:37:15.413 227317 DEBUG nova.compute.manager [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-changed-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:37:15 np0005596062 nova_compute[227313]: 2026-01-26 18:37:15.414 227317 DEBUG nova.compute.manager [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Refreshing instance network info cache due to event network-changed-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:37:15 np0005596062 nova_compute[227313]: 2026-01-26 18:37:15.414 227317 DEBUG oslo_concurrency.lockutils [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:37:15 np0005596062 nova_compute[227313]: 2026-01-26 18:37:15.414 227317 DEBUG oslo_concurrency.lockutils [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:37:15 np0005596062 nova_compute[227313]: 2026-01-26 18:37:15.415 227317 DEBUG nova.network.neutron [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Refreshing network info cache for port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:37:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:15.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:17 np0005596062 nova_compute[227313]: 2026-01-26 18:37:17.216 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:17 np0005596062 ceph-mon[77178]: Health check update: 10 slow ops, oldest one blocked for 67 sec, mon.compute-1 has slow ops (SLOW_OPS)
Jan 26 13:37:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:17.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:17 np0005596062 nova_compute[227313]: 2026-01-26 18:37:17.637 227317 DEBUG nova.network.neutron [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updated VIF entry in instance network info cache for port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:37:17 np0005596062 nova_compute[227313]: 2026-01-26 18:37:17.638 227317 DEBUG nova.network.neutron [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:37:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:17.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:18 np0005596062 nova_compute[227313]: 2026-01-26 18:37:18.904 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:19.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:19 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:19Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:b9:47 10.100.0.12
Jan 26 13:37:19 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:19Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:b9:47 10.100.0.12
Jan 26 13:37:19 np0005596062 ceph-mon[77178]: Health check cleared: SLOW_OPS (was: 10 slow ops, oldest one blocked for 67 sec, mon.compute-1 has slow ops)
Jan 26 13:37:19 np0005596062 ceph-mon[77178]: Cluster is now healthy
Jan 26 13:37:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:19.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:19 np0005596062 nova_compute[227313]: 2026-01-26 18:37:19.874 227317 DEBUG oslo_concurrency.lockutils [req-1442fa77-178c-4078-9dcd-e2d0b2b33639 req-5709e19f-149b-46f1-8dfc-e5aed73ba184 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:37:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:21.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:21.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:22 np0005596062 nova_compute[227313]: 2026-01-26 18:37:22.219 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:37:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:23.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:37:23 np0005596062 nova_compute[227313]: 2026-01-26 18:37:23.491 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:23 np0005596062 nova_compute[227313]: 2026-01-26 18:37:23.491 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:37:23 np0005596062 nova_compute[227313]: 2026-01-26 18:37:23.491 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:37:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:37:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:37:23 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:37:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:23.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:23 np0005596062 nova_compute[227313]: 2026-01-26 18:37:23.907 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:25.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:25.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.575 227317 INFO nova.compute.manager [None req-ccdb2276-235c-4ea0-b588-7872408cb3ca ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Get console output#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.580 254751 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 13:37:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:26 np0005596062 podman[257856]: 2026-01-26 18:37:26.869286757 +0000 UTC m=+0.083737564 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.908 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.908 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.909 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.909 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 46f89010-5c5d-4c32-ba88-951b6d640927 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.958 227317 DEBUG oslo_concurrency.lockutils [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.959 227317 DEBUG oslo_concurrency.lockutils [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.960 227317 INFO nova.compute.manager [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Rebooting instance#033[00m
Jan 26 13:37:26 np0005596062 nova_compute[227313]: 2026-01-26 18:37:26.983 227317 DEBUG oslo_concurrency.lockutils [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:37:27 np0005596062 nova_compute[227313]: 2026-01-26 18:37:27.221 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:27.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:27.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:28 np0005596062 nova_compute[227313]: 2026-01-26 18:37:28.910 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:29.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:29.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:37:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.268 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.296 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.297 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.297 227317 DEBUG oslo_concurrency.lockutils [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquired lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.297 227317 DEBUG nova.network.neutron [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.298 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.299 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.299 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.299 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.299 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.299 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:31 np0005596062 nova_compute[227313]: 2026-01-26 18:37:31.300 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:37:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 13:37:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:31.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 13:37:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:31.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:32 np0005596062 nova_compute[227313]: 2026-01-26 18:37:32.224 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:33 np0005596062 nova_compute[227313]: 2026-01-26 18:37:33.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:33 np0005596062 nova_compute[227313]: 2026-01-26 18:37:33.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:37:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:33.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:33 np0005596062 nova_compute[227313]: 2026-01-26 18:37:33.657 227317 DEBUG nova.network.neutron [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:37:33 np0005596062 nova_compute[227313]: 2026-01-26 18:37:33.697 227317 DEBUG oslo_concurrency.lockutils [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Releasing lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:37:33 np0005596062 nova_compute[227313]: 2026-01-26 18:37:33.699 227317 DEBUG nova.compute.manager [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:37:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:33.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:33 np0005596062 nova_compute[227313]: 2026-01-26 18:37:33.913 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:35.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:35.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:36 np0005596062 kernel: tapec4b7772-c4 (unregistering): left promiscuous mode
Jan 26 13:37:36 np0005596062 NetworkManager[48993]: <info>  [1769452656.1034] device (tapec4b7772-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00213|binding|INFO|Releasing lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 from this chassis (sb_readonly=0)
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.114 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00214|binding|INFO|Setting lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 down in Southbound
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00215|binding|INFO|Removing iface tapec4b7772-c4 ovn-installed in OVS
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.117 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.136 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.140 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:b9:47 10.100.0.12'], port_security=['fa:16:3e:d8:b9:47 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46f89010-5c5d-4c32-ba88-951b6d640927', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da815054-d79b-464b-a232-6b8265207d78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f896585a-50c1-4248-9fde-f1b2702fb2aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.142 143929 INFO neutron.agent.ovn.metadata.agent [-] Port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 in datapath 25ae0294-d511-4bdd-8a1f-f103179c52b7 unbound from our chassis#033[00m
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.143 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25ae0294-d511-4bdd-8a1f-f103179c52b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.144 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c72c0a-fa4f-4f9a-a03a-cec0137bcf56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.145 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 namespace which is not needed anymore#033[00m
Jan 26 13:37:36 np0005596062 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 26 13:37:36 np0005596062 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 14.720s CPU time.
Jan 26 13:37:36 np0005596062 systemd-machined[195380]: Machine qemu-19-instance-00000019 terminated.
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.458 227317 DEBUG nova.compute.manager [req-64da3c8b-f2b1-411c-850e-d36f3c3ac088 req-9c32504f-c45c-4f42-9305-027ecac353ee 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-unplugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.459 227317 DEBUG oslo_concurrency.lockutils [req-64da3c8b-f2b1-411c-850e-d36f3c3ac088 req-9c32504f-c45c-4f42-9305-027ecac353ee 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.460 227317 DEBUG oslo_concurrency.lockutils [req-64da3c8b-f2b1-411c-850e-d36f3c3ac088 req-9c32504f-c45c-4f42-9305-027ecac353ee 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.460 227317 DEBUG oslo_concurrency.lockutils [req-64da3c8b-f2b1-411c-850e-d36f3c3ac088 req-9c32504f-c45c-4f42-9305-027ecac353ee 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.461 227317 DEBUG nova.compute.manager [req-64da3c8b-f2b1-411c-850e-d36f3c3ac088 req-9c32504f-c45c-4f42-9305-027ecac353ee 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-unplugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.461 227317 WARNING nova.compute.manager [req-64da3c8b-f2b1-411c-850e-d36f3c3ac088 req-9c32504f-c45c-4f42-9305-027ecac353ee 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received unexpected event network-vif-unplugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with vm_state active and task_state reboot_started.#033[00m
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.555 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.557 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:37:36 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [NOTICE]   (257588) : haproxy version is 2.8.14-c23fe91
Jan 26 13:37:36 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [NOTICE]   (257588) : path to executable is /usr/sbin/haproxy
Jan 26 13:37:36 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [WARNING]  (257588) : Exiting Master process...
Jan 26 13:37:36 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [ALERT]    (257588) : Current worker (257590) exited with code 143 (Terminated)
Jan 26 13:37:36 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[257584]: [WARNING]  (257588) : All workers exited. Exiting... (0)
Jan 26 13:37:36 np0005596062 systemd[1]: libpod-da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94.scope: Deactivated successfully.
Jan 26 13:37:36 np0005596062 podman[257961]: 2026-01-26 18:37:36.708316212 +0000 UTC m=+0.453144949 container died da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.835 227317 INFO nova.virt.libvirt.driver [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance shutdown successfully.#033[00m
Jan 26 13:37:36 np0005596062 kernel: tapec4b7772-c4: entered promiscuous mode
Jan 26 13:37:36 np0005596062 systemd-udevd[257940]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.903 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00216|binding|INFO|Claiming lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for this chassis.
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00217|binding|INFO|ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4: Claiming fa:16:3e:d8:b9:47 10.100.0.12
Jan 26 13:37:36 np0005596062 NetworkManager[48993]: <info>  [1769452656.9059] manager: (tapec4b7772-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Jan 26 13:37:36 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:36.915 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:b9:47 10.100.0.12'], port_security=['fa:16:3e:d8:b9:47 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46f89010-5c5d-4c32-ba88-951b6d640927', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'da815054-d79b-464b-a232-6b8265207d78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f896585a-50c1-4248-9fde-f1b2702fb2aa, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00218|binding|INFO|Setting lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 ovn-installed in OVS
Jan 26 13:37:36 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:36Z|00219|binding|INFO|Setting lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 up in Southbound
Jan 26 13:37:36 np0005596062 nova_compute[227313]: 2026-01-26 18:37:36.921 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:36 np0005596062 NetworkManager[48993]: <info>  [1769452656.9230] device (tapec4b7772-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:37:36 np0005596062 NetworkManager[48993]: <info>  [1769452656.9244] device (tapec4b7772-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:37:36 np0005596062 systemd-machined[195380]: New machine qemu-20-instance-00000019.
Jan 26 13:37:36 np0005596062 systemd[1]: Started Virtual Machine qemu-20-instance-00000019.
Jan 26 13:37:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:36 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94-userdata-shm.mount: Deactivated successfully.
Jan 26 13:37:36 np0005596062 systemd[1]: var-lib-containers-storage-overlay-0a04e4c16ed83638fad151ffd6b3fe1e0b9cd792301c093ec50f3884d90650c0-merged.mount: Deactivated successfully.
Jan 26 13:37:37 np0005596062 podman[257961]: 2026-01-26 18:37:37.016931183 +0000 UTC m=+0.761759920 container cleanup da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:37:37 np0005596062 systemd[1]: libpod-conmon-da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94.scope: Deactivated successfully.
Jan 26 13:37:37 np0005596062 podman[258024]: 2026-01-26 18:37:37.102827635 +0000 UTC m=+0.052173713 container remove da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.116 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[59282bf9-8b5f-4f99-a1df-5190eb19bdaf]: (4, ('Mon Jan 26 06:37:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 (da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94)\nda9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94\nMon Jan 26 06:37:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 (da9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94)\nda9f79e7b2f382f1adf1e1a2f44cba9741f38dfd5cefc638a394de4ffe951c94\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.118 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0c0f11-6ee3-4eb6-81d8-c1216464de92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.119 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25ae0294-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:37 np0005596062 nova_compute[227313]: 2026-01-26 18:37:37.121 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:37 np0005596062 kernel: tap25ae0294-d0: left promiscuous mode
Jan 26 13:37:37 np0005596062 nova_compute[227313]: 2026-01-26 18:37:37.135 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.137 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fe520e6e-0efa-4d5b-a628-5f4e87d7678f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.153 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[09b62aa0-44a9-4b9b-b1ba-f9ae1f11161b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.155 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9e15b71c-8fbd-4e5e-919e-34ea88c60523]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.170 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ea5a1e-fc2f-43d9-b9c5-fabdf5b48011]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638626, 'reachable_time': 31605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258040, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 systemd[1]: run-netns-ovnmeta\x2d25ae0294\x2dd511\x2d4bdd\x2d8a1f\x2df103179c52b7.mount: Deactivated successfully.
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.173 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.173 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4ab83f-153f-42a8-9445-e235a8fd426c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.176 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.177 143929 INFO neutron.agent.ovn.metadata.agent [-] Port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 in datapath 25ae0294-d511-4bdd-8a1f-f103179c52b7 unbound from our chassis#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.178 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25ae0294-d511-4bdd-8a1f-f103179c52b7#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.188 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4a26d94c-9d02-4d0f-abdc-aca39de80dcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.189 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap25ae0294-d1 in ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.190 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap25ae0294-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.190 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9f60f047-a262-4618-a657-68d86a506453]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.191 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd36e42-5405-44d8-8c81-9f830c42749f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.207 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc745f4-ec05-4ecd-9f53-f59a25f7cefc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 nova_compute[227313]: 2026-01-26 18:37:37.226 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.231 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[88944019-36ea-4e50-87a0-3811f6e78b85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.265 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[048dcfd6-58ba-4e62-ac86-5eab76ebd25d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.276 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fccc979c-877d-47f6-914d-8a6ad134deef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 NetworkManager[48993]: <info>  [1769452657.2783] manager: (tap25ae0294-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.308 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[5697b81f-17d5-4a05-a59d-70afca723f46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.311 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8390d0-2d6e-4cd0-96a2-ceb95c169b18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 NetworkManager[48993]: <info>  [1769452657.3346] device (tap25ae0294-d0): carrier: link connected
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.342 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[bab616cc-1738-40d7-a15d-bad96a0341eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:37.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.365 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[977fff53-2e5a-4752-b058-1afe1566e666]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25ae0294-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:aa:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641738, 'reachable_time': 16724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258067, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.385 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[eef3bc59-5abb-4429-9d9d-9a7777fab39f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:aa50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 641738, 'tstamp': 641738}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258068, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.404 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad185ce-084b-4248-813b-8d7bc862446f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25ae0294-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:aa:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641738, 'reachable_time': 16724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258069, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.444 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[60f0e557-2a51-4d41-be48-b89ea1b9f05f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.502 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0208ebba-62ac-45fa-b262-d306e80d9131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.504 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25ae0294-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.504 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.505 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25ae0294-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:37 np0005596062 nova_compute[227313]: 2026-01-26 18:37:37.507 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:37 np0005596062 kernel: tap25ae0294-d0: entered promiscuous mode
Jan 26 13:37:37 np0005596062 NetworkManager[48993]: <info>  [1769452657.5094] manager: (tap25ae0294-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.512 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25ae0294-d0, col_values=(('external_ids', {'iface-id': 'e1b2dc53-fbcf-4b93-9880-910123b0b71c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:37 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:37Z|00220|binding|INFO|Releasing lport e1b2dc53-fbcf-4b93-9880-910123b0b71c from this chassis (sb_readonly=0)
Jan 26 13:37:37 np0005596062 nova_compute[227313]: 2026-01-26 18:37:37.514 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.517 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/25ae0294-d511-4bdd-8a1f-f103179c52b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/25ae0294-d511-4bdd-8a1f-f103179c52b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.518 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0967d50a-073b-4a8e-93c4-28857ffc5e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.519 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-25ae0294-d511-4bdd-8a1f-f103179c52b7
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/25ae0294-d511-4bdd-8a1f-f103179c52b7.pid.haproxy
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 25ae0294-d511-4bdd-8a1f-f103179c52b7
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:37:37 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:37.520 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'env', 'PROCESS_TAG=haproxy-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/25ae0294-d511-4bdd-8a1f-f103179c52b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:37:37 np0005596062 nova_compute[227313]: 2026-01-26 18:37:37.525 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:37.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:37 np0005596062 podman[258109]: 2026-01-26 18:37:37.897134192 +0000 UTC m=+0.056129938 container create b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 26 13:37:37 np0005596062 systemd[1]: Started libpod-conmon-b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285.scope.
Jan 26 13:37:37 np0005596062 podman[258109]: 2026-01-26 18:37:37.866903866 +0000 UTC m=+0.025899612 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:37:37 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:37:37 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58d2946065794f132b428d075ba1241fe2200c96469a7f1cf8aa924cd91d9659/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.006 227317 DEBUG nova.virt.libvirt.host [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Removed pending event for 46f89010-5c5d-4c32-ba88-951b6d640927 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.007 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452658.0055363, 46f89010-5c5d-4c32-ba88-951b6d640927 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.007 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:37:38 np0005596062 podman[258109]: 2026-01-26 18:37:38.006848069 +0000 UTC m=+0.165843835 container init b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 13:37:38 np0005596062 podman[258109]: 2026-01-26 18:37:38.015198821 +0000 UTC m=+0.174194567 container start b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.015 227317 INFO nova.virt.libvirt.driver [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance running successfully.#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.016 227317 INFO nova.virt.libvirt.driver [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance soft rebooted successfully.#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.016 227317 DEBUG nova.compute.manager [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:37:38 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [NOTICE]   (258163) : New worker (258165) forked
Jan 26 13:37:38 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [NOTICE]   (258163) : Loading success.
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.041 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.045 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.082 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.083 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452658.0101619, 46f89010-5c5d-4c32-ba88-951b6d640927 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.083 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] VM Started (Lifecycle Event)#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.105 227317 DEBUG oslo_concurrency.lockutils [None req-9beba710-9295-4be3-83e9-8a2e7f8a8672 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 11.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.106 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.110 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
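The two "Synchronizing instance power state" lines above show the guard in Nova's lifecycle handling: when the instance still has a pending task_state (here `reboot_started`), the power-state sync is skipped ("Skip.") rather than fighting the in-progress operation; once task_state is None the states are compared normally. A minimal sketch of that decision, with illustrative names and return values (not Nova's actual code):

```python
RUNNING = 1  # the power_state value logged as "1" above


def sync_power_state(db_power_state, vm_power_state, task_state):
    """Decide whether to reconcile the DB power state with the
    hypervisor, mirroring the skip-on-pending-task behaviour in
    the journal above."""
    if task_state is not None:
        # e.g. task_state == "reboot_started": another operation
        # owns the instance; do nothing and let it finish.
        return "skip"
    if db_power_state != vm_power_state:
        # states diverged with no operation in flight: reconcile.
        return "update"
    return "in_sync"
```

In the log, the "Resumed" event arrives with task_state `reboot_started` and is skipped; the later "Started" event arrives with task_state None and both power states at 1, so nothing needs updating.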
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.543 227317 DEBUG nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.543 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.544 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.545 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.545 227317 DEBUG nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.545 227317 WARNING nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received unexpected event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.545 227317 DEBUG nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.545 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.546 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.546 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.546 227317 DEBUG nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.546 227317 WARNING nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received unexpected event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.546 227317 DEBUG nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.547 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.547 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.547 227317 DEBUG oslo_concurrency.lockutils [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.547 227317 DEBUG nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.547 227317 WARNING nova.compute.manager [req-1579c412-8c6a-4993-a08c-100679b3be09 req-9d543a2d-8c96-4c3e-af3d-5c49a653329a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received unexpected event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with vm_state active and task_state None.#033[00m
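The three identical WARNING lines above come from Nova's instance-event dispatch: Neutron sends `network-vif-plugged` after the reboot has already completed (vm_state active, task_state None), so `pop_instance_event` finds no registered waiter and the event is logged as unexpected. A stdlib-only sketch of that waiter-registry pattern, with class and method names that are ours, not Nova's:

```python
import logging
import threading

log = logging.getLogger(__name__)


class InstanceEventRegistry:
    """Toy waiter registry: an operation registers interest in a named
    event before triggering it externally; the notification path pops
    and signals the waiter, or warns if nobody registered."""

    def __init__(self):
        self._lock = threading.Lock()  # guards the registry dict
        self._waiters = {}             # {instance: {event_name: Event}}

    def prepare(self, instance, event_name):
        """Register interest before starting the external action."""
        waiter = threading.Event()
        with self._lock:
            self._waiters.setdefault(instance, {})[event_name] = waiter
        return waiter

    def pop(self, instance, event_name):
        """Remove and return the waiter, or None if none registered."""
        with self._lock:
            return self._waiters.get(instance, {}).pop(event_name, None)

    def dispatch(self, instance, event_name):
        waiter = self.pop(instance, event_name)
        if waiter is None:
            # The "Received unexpected event ..." case in the log.
            log.warning("Received unexpected event %s for instance %s",
                        event_name, instance)
            return False
        waiter.set()
        return True
```

The warnings here are benign: the reboot path released its interest when it finished, so the late (and duplicated) vif-plugged notifications simply have no consumer.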
Jan 26 13:37:38 np0005596062 nova_compute[227313]: 2026-01-26 18:37:38.915 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:39 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:39.179 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:37:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:37:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:39.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:37:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:37:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:41.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:42 np0005596062 nova_compute[227313]: 2026-01-26 18:37:42.271 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:43.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:43 np0005596062 nova_compute[227313]: 2026-01-26 18:37:43.917 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:45.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:45.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:45 np0005596062 podman[258228]: 2026-01-26 18:37:45.852585763 +0000 UTC m=+0.062170579 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
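The `beast:` lines repeating throughout this window are radosgw access-log records with a fixed shape: client, user, bracketed timestamp, quoted request line, status, byte count, and a trailing latency. Here the clients 192.168.122.100/.102 are load-balancer health checks issuing anonymous `HEAD /` every two seconds. A small parser for that shape, assuming the format stays exactly as printed by this radosgw build (field names are ours):

```python
import re

# Matches the beast access-log lines seen above; the "- - -" run
# covers fields this build leaves empty.
BEAST_RE = re.compile(
    r'beast: \S+: (?P<client>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<request>[^"]+)" '
    r'(?P<status>\d+) (?P<bytes>\d+) .* latency=(?P<latency>[\d.]+)s'
)


def parse_beast(line):
    """Return a dict of typed fields for one beast access-log line,
    or None if the line does not match."""
    m = BEAST_RE.search(line)
    if not m:
        return None
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["bytes"] = int(rec["bytes"])
    rec["latency"] = float(rec["latency"])
    return rec
```

Filtering on `rec["user"] == "anonymous"` and `rec["request"].startswith("HEAD /")` would separate these health probes from real S3 traffic when summarizing the log.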
Jan 26 13:37:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:47 np0005596062 nova_compute[227313]: 2026-01-26 18:37:47.270 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:47.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:47.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:48 np0005596062 nova_compute[227313]: 2026-01-26 18:37:48.919 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:49.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:49.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:51 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:51Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:b9:47 10.100.0.12
Jan 26 13:37:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:51.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:51.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:52 np0005596062 nova_compute[227313]: 2026-01-26 18:37:52.271 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:53.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:37:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:53.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:37:53 np0005596062 nova_compute[227313]: 2026-01-26 18:37:53.966 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:55.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:37:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:55.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:37:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:37:57 np0005596062 nova_compute[227313]: 2026-01-26 18:37:57.275 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:57 np0005596062 nova_compute[227313]: 2026-01-26 18:37:57.305 227317 INFO nova.compute.manager [None req-7b9cd22f-ca18-446d-97ba-4c955b3bed33 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Get console output#033[00m
Jan 26 13:37:57 np0005596062 nova_compute[227313]: 2026-01-26 18:37:57.310 254751 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 13:37:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:57.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:57.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:57 np0005596062 podman[258253]: 2026-01-26 18:37:57.910021691 +0000 UTC m=+0.114437813 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:37:58 np0005596062 nova_compute[227313]: 2026-01-26 18:37:58.968 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:37:59.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.662 227317 DEBUG nova.compute.manager [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-changed-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.663 227317 DEBUG nova.compute.manager [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Refreshing instance network info cache due to event network-changed-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.663 227317 DEBUG oslo_concurrency.lockutils [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.664 227317 DEBUG oslo_concurrency.lockutils [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.664 227317 DEBUG nova.network.neutron [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Refreshing network info cache for port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.707 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.707 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.707 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.708 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.708 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.708 227317 INFO nova.compute.manager [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Terminating instance#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.709 227317 DEBUG nova.compute.manager [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:37:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:37:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:37:59 np0005596062 kernel: tapec4b7772-c4 (unregistering): left promiscuous mode
Jan 26 13:37:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:37:59.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:37:59 np0005596062 NetworkManager[48993]: <info>  [1769452679.7766] device (tapec4b7772-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:37:59 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:59Z|00221|binding|INFO|Releasing lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 from this chassis (sb_readonly=0)
Jan 26 13:37:59 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:59Z|00222|binding|INFO|Setting lport ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 down in Southbound
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.799 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:59 np0005596062 ovn_controller[133984]: 2026-01-26T18:37:59Z|00223|binding|INFO|Removing iface tapec4b7772-c4 ovn-installed in OVS
Jan 26 13:37:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:59.807 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:b9:47 10.100.0.12'], port_security=['fa:16:3e:d8:b9:47 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46f89010-5c5d-4c32-ba88-951b6d640927', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'da815054-d79b-464b-a232-6b8265207d78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f896585a-50c1-4248-9fde-f1b2702fb2aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:37:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:59.808 143929 INFO neutron.agent.ovn.metadata.agent [-] Port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 in datapath 25ae0294-d511-4bdd-8a1f-f103179c52b7 unbound from our chassis#033[00m
Jan 26 13:37:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:59.809 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25ae0294-d511-4bdd-8a1f-f103179c52b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:37:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:59.810 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f9fd8fa3-50c4-42f8-94f9-98506b541c13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:37:59 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:37:59.810 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 namespace which is not needed anymore#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.824 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:59 np0005596062 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 26 13:37:59 np0005596062 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000019.scope: Consumed 13.900s CPU time.
Jan 26 13:37:59 np0005596062 systemd-machined[195380]: Machine qemu-20-instance-00000019 terminated.
Jan 26 13:37:59 np0005596062 kernel: tapec4b7772-c4: entered promiscuous mode
Jan 26 13:37:59 np0005596062 kernel: tapec4b7772-c4 (unregistering): left promiscuous mode
Jan 26 13:37:59 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [NOTICE]   (258163) : haproxy version is 2.8.14-c23fe91
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.931 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:59 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [NOTICE]   (258163) : path to executable is /usr/sbin/haproxy
Jan 26 13:37:59 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [WARNING]  (258163) : Exiting Master process...
Jan 26 13:37:59 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [ALERT]    (258163) : Current worker (258165) exited with code 143 (Terminated)
Jan 26 13:37:59 np0005596062 neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7[258158]: [WARNING]  (258163) : All workers exited. Exiting... (0)
Jan 26 13:37:59 np0005596062 systemd[1]: libpod-b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285.scope: Deactivated successfully.
Jan 26 13:37:59 np0005596062 podman[258304]: 2026-01-26 18:37:59.940463971 +0000 UTC m=+0.051985378 container died b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.945 227317 INFO nova.virt.libvirt.driver [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Instance destroyed successfully.#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.945 227317 DEBUG nova.objects.instance [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'resources' on Instance uuid 46f89010-5c5d-4c32-ba88-951b6d640927 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.960 227317 DEBUG nova.virt.libvirt.vif [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:35:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2096981336',display_name='tempest-TestNetworkAdvancedServerOps-server-2096981336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2096981336',id=25,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQJz/YobTJceZvzJVuWmJsMcwBHeXUK3qg9BboX4DIZ5bVn3L/CtROKlHp/+NBsJy5WBPfnAbNkl+SqE4ICwsBMnEMwqWyuIpclQAnUTz3DIA/5r+AyFQgQJNuNY5sFTA==',key_name='tempest-TestNetworkAdvancedServerOps-185858677',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:37:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-7t4115p4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:37:38Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=46f89010-5c5d-4c32-ba88-951b6d640927,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.960 227317 DEBUG nova.network.os_vif_util [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.961 227317 DEBUG nova.network.os_vif_util [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.962 227317 DEBUG os_vif [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.964 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.964 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec4b7772-c4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.965 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.966 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:37:59 np0005596062 nova_compute[227313]: 2026-01-26 18:37:59.969 227317 INFO os_vif [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:b9:47,bridge_name='br-int',has_traffic_filtering=True,id=ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4,network=Network(25ae0294-d511-4bdd-8a1f-f103179c52b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4b7772-c4')#033[00m
Jan 26 13:38:00 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285-userdata-shm.mount: Deactivated successfully.
Jan 26 13:38:00 np0005596062 systemd[1]: var-lib-containers-storage-overlay-58d2946065794f132b428d075ba1241fe2200c96469a7f1cf8aa924cd91d9659-merged.mount: Deactivated successfully.
Jan 26 13:38:00 np0005596062 podman[258304]: 2026-01-26 18:38:00.050759783 +0000 UTC m=+0.162281170 container cleanup b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 13:38:00 np0005596062 systemd[1]: libpod-conmon-b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285.scope: Deactivated successfully.
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.113 227317 DEBUG nova.compute.manager [req-412d539f-1605-4082-b098-27c1daa056cb req-6db9bf7b-b4dd-4843-8538-5799a125c1e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-unplugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.114 227317 DEBUG oslo_concurrency.lockutils [req-412d539f-1605-4082-b098-27c1daa056cb req-6db9bf7b-b4dd-4843-8538-5799a125c1e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.114 227317 DEBUG oslo_concurrency.lockutils [req-412d539f-1605-4082-b098-27c1daa056cb req-6db9bf7b-b4dd-4843-8538-5799a125c1e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.115 227317 DEBUG oslo_concurrency.lockutils [req-412d539f-1605-4082-b098-27c1daa056cb req-6db9bf7b-b4dd-4843-8538-5799a125c1e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.115 227317 DEBUG nova.compute.manager [req-412d539f-1605-4082-b098-27c1daa056cb req-6db9bf7b-b4dd-4843-8538-5799a125c1e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-unplugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.115 227317 DEBUG nova.compute.manager [req-412d539f-1605-4082-b098-27c1daa056cb req-6db9bf7b-b4dd-4843-8538-5799a125c1e7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-unplugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 13:38:00 np0005596062 podman[258361]: 2026-01-26 18:38:00.121069219 +0000 UTC m=+0.048504055 container remove b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.126 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4d730a2a-098b-45c9-85c9-df2d8cc4a901]: (4, ('Mon Jan 26 06:37:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 (b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285)\nb0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285\nMon Jan 26 06:38:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 (b0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285)\nb0ee1393afba3ab7167ba7abf327521dba566a9d6de052a8b134421ae9ded285\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.127 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d3fba1d3-a66d-48eb-b0ad-3459ea44ec7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.128 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25ae0294-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:38:00 np0005596062 kernel: tap25ae0294-d0: left promiscuous mode
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.130 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.142 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.144 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[74d10776-4085-4104-a2f4-923d4da05260]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.160 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa5f87b-3faf-4062-9300-5b3e3d6706f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.161 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d20d76-69ae-4de3-a26b-9351f862ddd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.176 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1ed157-cfc6-4af6-9798-fed762799ac5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641730, 'reachable_time': 23723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258376, 'error': None, 'target': 'ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.180 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-25ae0294-d511-4bdd-8a1f-f103179c52b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:38:00 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:00.180 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[e1fa2546-8a94-4388-9d3c-1b023f658933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:38:00 np0005596062 systemd[1]: run-netns-ovnmeta\x2d25ae0294\x2dd511\x2d4bdd\x2d8a1f\x2df103179c52b7.mount: Deactivated successfully.
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.435 227317 INFO nova.virt.libvirt.driver [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Deleting instance files /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927_del#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.437 227317 INFO nova.virt.libvirt.driver [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Deletion of /var/lib/nova/instances/46f89010-5c5d-4c32-ba88-951b6d640927_del complete#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.502 227317 INFO nova.compute.manager [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Took 0.79 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.502 227317 DEBUG oslo.service.loopingcall [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.503 227317 DEBUG nova.compute.manager [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 13:38:00 np0005596062 nova_compute[227313]: 2026-01-26 18:38:00.503 227317 DEBUG nova.network.neutron [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 13:38:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:01.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:01.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.892 227317 DEBUG nova.network.neutron [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updated VIF entry in instance network info cache for port ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.893 227317 DEBUG nova.network.neutron [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [{"id": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "address": "fa:16:3e:d8:b9:47", "network": {"id": "25ae0294-d511-4bdd-8a1f-f103179c52b7", "bridge": "br-int", "label": "tempest-network-smoke--371014775", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4b7772-c4", "ovs_interfaceid": "ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.920 227317 DEBUG nova.network.neutron [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.922 227317 DEBUG oslo_concurrency.lockutils [req-6202cd03-6959-4966-b166-7c47d33e2012 req-08c37282-1dc7-4b5b-af83-b129bcdbcf76 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-46f89010-5c5d-4c32-ba88-951b6d640927" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.938 227317 INFO nova.compute.manager [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Took 1.43 seconds to deallocate network for instance.#033[00m
Jan 26 13:38:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.985 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:38:01 np0005596062 nova_compute[227313]: 2026-01-26 18:38:01.986 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.000 227317 DEBUG nova.compute.manager [req-3f960e10-85bc-4ce0-8e07-d36c6258c045 req-79620b53-33c1-4465-a95e-c170cfff4b23 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-deleted-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.156 227317 DEBUG nova.scheduler.client.report [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.241 227317 DEBUG nova.scheduler.client.report [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.241 227317 DEBUG nova.compute.provider_tree [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.276 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.294 227317 DEBUG nova.compute.manager [req-5bb49e0f-7f87-45b6-b2c1-002d39185034 req-397b7384-2d19-4105-9933-ff1975042535 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.294 227317 DEBUG oslo_concurrency.lockutils [req-5bb49e0f-7f87-45b6-b2c1-002d39185034 req-397b7384-2d19-4105-9933-ff1975042535 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.295 227317 DEBUG oslo_concurrency.lockutils [req-5bb49e0f-7f87-45b6-b2c1-002d39185034 req-397b7384-2d19-4105-9933-ff1975042535 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.295 227317 DEBUG oslo_concurrency.lockutils [req-5bb49e0f-7f87-45b6-b2c1-002d39185034 req-397b7384-2d19-4105-9933-ff1975042535 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.296 227317 DEBUG nova.compute.manager [req-5bb49e0f-7f87-45b6-b2c1-002d39185034 req-397b7384-2d19-4105-9933-ff1975042535 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] No waiting events found dispatching network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.296 227317 WARNING nova.compute.manager [req-5bb49e0f-7f87-45b6-b2c1-002d39185034 req-397b7384-2d19-4105-9933-ff1975042535 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Received unexpected event network-vif-plugged-ec4b7772-c4c1-4dde-ad2e-9eb6eb0226a4 for instance with vm_state deleted and task_state None.#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.378 227317 DEBUG nova.scheduler.client.report [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.410 227317 DEBUG nova.scheduler.client.report [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.461 227317 DEBUG oslo_concurrency.processutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:38:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:38:02 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1276771210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.893 227317 DEBUG oslo_concurrency.processutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.901 227317 DEBUG nova.compute.provider_tree [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.933 227317 DEBUG nova.scheduler.client.report [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:38:02 np0005596062 nova_compute[227313]: 2026-01-26 18:38:02.966 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:03 np0005596062 nova_compute[227313]: 2026-01-26 18:38:03.027 227317 INFO nova.scheduler.client.report [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Deleted allocations for instance 46f89010-5c5d-4c32-ba88-951b6d640927#033[00m
Jan 26 13:38:03 np0005596062 nova_compute[227313]: 2026-01-26 18:38:03.120 227317 DEBUG oslo_concurrency.lockutils [None req-0d3d3bc0-c745-4d6b-977c-33125cfda17a ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "46f89010-5c5d-4c32-ba88-951b6d640927" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:03.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:03.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:04 np0005596062 nova_compute[227313]: 2026-01-26 18:38:04.968 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:05.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:05.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:07 np0005596062 nova_compute[227313]: 2026-01-26 18:38:07.327 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:38:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:07.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:38:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:07.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:09.187 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:38:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:09.188 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:38:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:09.188 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:09.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:09.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:09 np0005596062 nova_compute[227313]: 2026-01-26 18:38:09.786 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:09 np0005596062 nova_compute[227313]: 2026-01-26 18:38:09.848 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:09 np0005596062 nova_compute[227313]: 2026-01-26 18:38:09.970 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:11.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:11.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:12 np0005596062 nova_compute[227313]: 2026-01-26 18:38:12.329 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:13.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:13.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:14 np0005596062 nova_compute[227313]: 2026-01-26 18:38:14.943 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452679.9424036, 46f89010-5c5d-4c32-ba88-951b6d640927 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:38:14 np0005596062 nova_compute[227313]: 2026-01-26 18:38:14.944 227317 INFO nova.compute.manager [-] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:38:14 np0005596062 nova_compute[227313]: 2026-01-26 18:38:14.973 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.064 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.117 227317 DEBUG nova.compute.manager [None req-343c335b-46ef-4ba2-b294-ef3aaf9a1726 - - - - - -] [instance: 46f89010-5c5d-4c32-ba88-951b6d640927] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.120 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.120 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.120 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.120 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.121 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:38:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:38:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:15.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:38:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:38:15 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/360792319' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.541 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.682 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.683 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4756MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.683 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.683 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.763 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.763 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:38:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:15.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:15 np0005596062 nova_compute[227313]: 2026-01-26 18:38:15.822 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:38:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:38:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/178904069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:38:16 np0005596062 nova_compute[227313]: 2026-01-26 18:38:16.465 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.643s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:38:16 np0005596062 nova_compute[227313]: 2026-01-26 18:38:16.473 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:38:16 np0005596062 nova_compute[227313]: 2026-01-26 18:38:16.497 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:38:16 np0005596062 nova_compute[227313]: 2026-01-26 18:38:16.516 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:38:16 np0005596062 nova_compute[227313]: 2026-01-26 18:38:16.517 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:38:16 np0005596062 podman[258505]: 2026-01-26 18:38:16.846632674 +0000 UTC m=+0.052594744 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 13:38:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:17 np0005596062 nova_compute[227313]: 2026-01-26 18:38:17.331 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:17.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:17.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:19.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:19.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:19 np0005596062 nova_compute[227313]: 2026-01-26 18:38:19.976 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:21.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:21.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:22 np0005596062 nova_compute[227313]: 2026-01-26 18:38:22.373 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:22 np0005596062 nova_compute[227313]: 2026-01-26 18:38:22.498 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:23 np0005596062 nova_compute[227313]: 2026-01-26 18:38:23.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:23 np0005596062 nova_compute[227313]: 2026-01-26 18:38:23.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:38:23 np0005596062 nova_compute[227313]: 2026-01-26 18:38:23.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:38:23 np0005596062 nova_compute[227313]: 2026-01-26 18:38:23.072 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:38:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:38:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:23.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:38:24 np0005596062 nova_compute[227313]: 2026-01-26 18:38:24.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:24 np0005596062 nova_compute[227313]: 2026-01-26 18:38:24.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:24 np0005596062 nova_compute[227313]: 2026-01-26 18:38:24.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:38:24 np0005596062 nova_compute[227313]: 2026-01-26 18:38:24.979 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:25 np0005596062 nova_compute[227313]: 2026-01-26 18:38:25.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:38:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:25.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:38:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:25.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:26 np0005596062 nova_compute[227313]: 2026-01-26 18:38:26.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:27 np0005596062 nova_compute[227313]: 2026-01-26 18:38:27.374 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:27.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:27.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:28 np0005596062 nova_compute[227313]: 2026-01-26 18:38:28.045 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:28 np0005596062 nova_compute[227313]: 2026-01-26 18:38:28.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:28 np0005596062 podman[258581]: 2026-01-26 18:38:28.876435064 +0000 UTC m=+0.085758119 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 13:38:29 np0005596062 nova_compute[227313]: 2026-01-26 18:38:29.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:38:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:38:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:29.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:38:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:29.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:30 np0005596062 nova_compute[227313]: 2026-01-26 18:38:30.030 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:38:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:38:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:38:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:31.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:31.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:32 np0005596062 nova_compute[227313]: 2026-01-26 18:38:32.376 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:33.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:33.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:35 np0005596062 nova_compute[227313]: 2026-01-26 18:38:35.033 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:35.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:35.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:37 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:38:37 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:38:37 np0005596062 nova_compute[227313]: 2026-01-26 18:38:37.378 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:37.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:37.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:38:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:39.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:38:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:39.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:40 np0005596062 nova_compute[227313]: 2026-01-26 18:38:40.036 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:41.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:41.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:42 np0005596062 nova_compute[227313]: 2026-01-26 18:38:42.380 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:43.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:43.636 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:38:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:43.637 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:38:43 np0005596062 nova_compute[227313]: 2026-01-26 18:38:43.637 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:38:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:43.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:38:45 np0005596062 nova_compute[227313]: 2026-01-26 18:38:45.039 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:45.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:45.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:47 np0005596062 nova_compute[227313]: 2026-01-26 18:38:47.381 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:47.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:47.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:47 np0005596062 podman[258846]: 2026-01-26 18:38:47.870645361 +0000 UTC m=+0.079799900 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Jan 26 13:38:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:49.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:49 np0005596062 ovn_controller[133984]: 2026-01-26T18:38:49Z|00224|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 13:38:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:49.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:50 np0005596062 nova_compute[227313]: 2026-01-26 18:38:50.047 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:51.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:51 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:38:51.638 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:38:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:51.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:52 np0005596062 nova_compute[227313]: 2026-01-26 18:38:52.383 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:53.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:53.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:55 np0005596062 nova_compute[227313]: 2026-01-26 18:38:55.050 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:55.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:55.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:38:57 np0005596062 nova_compute[227313]: 2026-01-26 18:38:57.385 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:38:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:57.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:57.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:38:59.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:38:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:38:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:38:59.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:38:59 np0005596062 podman[258873]: 2026-01-26 18:38:59.895618225 +0000 UTC m=+0.101033616 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 13:39:00 np0005596062 nova_compute[227313]: 2026-01-26 18:39:00.051 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:01.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:01.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:02 np0005596062 nova_compute[227313]: 2026-01-26 18:39:02.388 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:03.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:03.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:05 np0005596062 nova_compute[227313]: 2026-01-26 18:39:05.054 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:05.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:05.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:07 np0005596062 nova_compute[227313]: 2026-01-26 18:39:07.390 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:07.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:07.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:39:09.189 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:39:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:39:09.189 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:39:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:39:09.189 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:39:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:09.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:09.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:10 np0005596062 nova_compute[227313]: 2026-01-26 18:39:10.103 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:11.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:11.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:12 np0005596062 nova_compute[227313]: 2026-01-26 18:39:12.392 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:13.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:13.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.085 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.085 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.085 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.085 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.085 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.106 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:15.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:39:15 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3504132756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.511 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.643 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.644 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4750MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.644 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.644 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.701 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.701 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:39:15 np0005596062 nova_compute[227313]: 2026-01-26 18:39:15.719 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:39:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:15.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:39:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1672779633' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:39:16 np0005596062 nova_compute[227313]: 2026-01-26 18:39:16.149 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:39:16 np0005596062 nova_compute[227313]: 2026-01-26 18:39:16.154 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:39:16 np0005596062 nova_compute[227313]: 2026-01-26 18:39:16.168 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:39:16 np0005596062 nova_compute[227313]: 2026-01-26 18:39:16.171 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:39:16 np0005596062 nova_compute[227313]: 2026-01-26 18:39:16.171 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:39:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:17 np0005596062 nova_compute[227313]: 2026-01-26 18:39:17.393 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:17.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:17.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:18 np0005596062 podman[259005]: 2026-01-26 18:39:18.840640659 +0000 UTC m=+0.048703470 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 13:39:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:39:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:19.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:39:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:19.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:20 np0005596062 nova_compute[227313]: 2026-01-26 18:39:20.110 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:21.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:21.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:22 np0005596062 nova_compute[227313]: 2026-01-26 18:39:22.395 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:39:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:23.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:39:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:23.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:25 np0005596062 nova_compute[227313]: 2026-01-26 18:39:25.113 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:25.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:25.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.171 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.172 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.172 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.187 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.187 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.187 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.188 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.188 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:26 np0005596062 nova_compute[227313]: 2026-01-26 18:39:26.188 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:39:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:27 np0005596062 nova_compute[227313]: 2026-01-26 18:39:27.397 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:27.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:27.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:29 np0005596062 nova_compute[227313]: 2026-01-26 18:39:29.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:29 np0005596062 nova_compute[227313]: 2026-01-26 18:39:29.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:29.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:29.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:30 np0005596062 nova_compute[227313]: 2026-01-26 18:39:30.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:39:30 np0005596062 nova_compute[227313]: 2026-01-26 18:39:30.117 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:30 np0005596062 podman[259082]: 2026-01-26 18:39:30.872643318 +0000 UTC m=+0.080875058 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 26 13:39:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:31.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:31.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:39:32.202 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:39:32 np0005596062 nova_compute[227313]: 2026-01-26 18:39:32.202 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:39:32.203 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:39:32 np0005596062 nova_compute[227313]: 2026-01-26 18:39:32.397 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:33.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:39:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:33.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:39:35 np0005596062 nova_compute[227313]: 2026-01-26 18:39:35.120 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:35.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:35.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:37 np0005596062 nova_compute[227313]: 2026-01-26 18:39:37.399 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:37.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:37.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:38 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:39.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:39.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:40 np0005596062 nova_compute[227313]: 2026-01-26 18:39:40.124 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.294934) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452780294971, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2163, "num_deletes": 252, "total_data_size": 5559130, "memory_usage": 5633464, "flush_reason": "Manual Compaction"}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452780316570, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3645250, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44700, "largest_seqno": 46858, "table_properties": {"data_size": 3636751, "index_size": 5121, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18901, "raw_average_key_size": 20, "raw_value_size": 3619119, "raw_average_value_size": 3942, "num_data_blocks": 223, "num_entries": 918, "num_filter_entries": 918, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452601, "oldest_key_time": 1769452601, "file_creation_time": 1769452780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 21672 microseconds, and 8307 cpu microseconds.
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.316604) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3645250 bytes OK
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.316620) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.318085) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.318097) EVENT_LOG_v1 {"time_micros": 1769452780318093, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.318113) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5549545, prev total WAL file size 5549545, number of live WAL files 2.
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.319149) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3559KB)], [87(10MB)]
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452780319192, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 14605409, "oldest_snapshot_seqno": -1}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 6930 keys, 12580039 bytes, temperature: kUnknown
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452780392977, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 12580039, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12532441, "index_size": 29182, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 178510, "raw_average_key_size": 25, "raw_value_size": 12406635, "raw_average_value_size": 1790, "num_data_blocks": 1167, "num_entries": 6930, "num_filter_entries": 6930, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.393213) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 12580039 bytes
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.394447) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 197.7 rd, 170.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.5 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(7.5) write-amplify(3.5) OK, records in: 7459, records dropped: 529 output_compression: NoCompression
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.394464) EVENT_LOG_v1 {"time_micros": 1769452780394456, "job": 54, "event": "compaction_finished", "compaction_time_micros": 73859, "compaction_time_cpu_micros": 26811, "output_level": 6, "num_output_files": 1, "total_output_size": 12580039, "num_input_records": 7459, "num_output_records": 6930, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452780395120, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452780396782, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.319057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.396824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.396828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.396830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.396831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:39:40.396832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/849599790' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:39:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/849599790' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:39:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:39:41.205 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:39:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:41 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:39:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:41.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:41.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:42 np0005596062 nova_compute[227313]: 2026-01-26 18:39:42.401 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:43.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:43.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:45 np0005596062 nova_compute[227313]: 2026-01-26 18:39:45.127 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:45.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:47 np0005596062 nova_compute[227313]: 2026-01-26 18:39:47.402 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:47.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:47 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:39:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:47.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:39:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:49.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:39:49 np0005596062 podman[259349]: 2026-01-26 18:39:49.857169353 +0000 UTC m=+0.063805442 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 13:39:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:49.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:50 np0005596062 nova_compute[227313]: 2026-01-26 18:39:50.131 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:51.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:51.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:52 np0005596062 nova_compute[227313]: 2026-01-26 18:39:52.404 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:53.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:53.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:55 np0005596062 nova_compute[227313]: 2026-01-26 18:39:55.134 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:55.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:55.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:39:56 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:39:57 np0005596062 nova_compute[227313]: 2026-01-26 18:39:57.405 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:39:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:57.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:57.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:39:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:39:59.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:39:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:39:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:39:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:39:59.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:00 np0005596062 nova_compute[227313]: 2026-01-26 18:40:00.137 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:00 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:40:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:01.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:01 np0005596062 podman[259374]: 2026-01-26 18:40:01.905734235 +0000 UTC m=+0.116912187 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:40:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:40:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:40:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:02 np0005596062 nova_compute[227313]: 2026-01-26 18:40:02.407 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:03.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:03.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:05 np0005596062 nova_compute[227313]: 2026-01-26 18:40:05.140 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:05.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:05.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.162 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.162 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.177 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.343 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.344 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.352 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.352 227317 INFO nova.compute.claims [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.489 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.678283) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452806678315, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 505, "num_deletes": 252, "total_data_size": 735684, "memory_usage": 745952, "flush_reason": "Manual Compaction"}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452806683123, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 378062, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 46863, "largest_seqno": 47363, "table_properties": {"data_size": 375464, "index_size": 634, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6987, "raw_average_key_size": 20, "raw_value_size": 370185, "raw_average_value_size": 1091, "num_data_blocks": 28, "num_entries": 339, "num_filter_entries": 339, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452781, "oldest_key_time": 1769452781, "file_creation_time": 1769452806, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 4890 microseconds, and 1912 cpu microseconds.
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.683169) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 378062 bytes OK
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.683188) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.684968) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.684984) EVENT_LOG_v1 {"time_micros": 1769452806684978, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.685002) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 732677, prev total WAL file size 732677, number of live WAL files 2.
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.685564) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353033' seq:72057594037927935, type:22 .. '6D6772737461740031373536' seq:0, type:0; will stop at (end)
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(369KB)], [90(11MB)]
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452806685648, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 12958101, "oldest_snapshot_seqno": -1}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 6760 keys, 9195150 bytes, temperature: kUnknown
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452806766659, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 9195150, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9153208, "index_size": 23910, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 175180, "raw_average_key_size": 25, "raw_value_size": 9034952, "raw_average_value_size": 1336, "num_data_blocks": 947, "num_entries": 6760, "num_filter_entries": 6760, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452806, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.767029) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 9195150 bytes
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.768367) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.7 rd, 113.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 12.0 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(58.6) write-amplify(24.3) OK, records in: 7269, records dropped: 509 output_compression: NoCompression
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.768399) EVENT_LOG_v1 {"time_micros": 1769452806768384, "job": 56, "event": "compaction_finished", "compaction_time_micros": 81161, "compaction_time_cpu_micros": 44068, "output_level": 6, "num_output_files": 1, "total_output_size": 9195150, "num_input_records": 7269, "num_output_records": 6760, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452806768723, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452806772279, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.685391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.772400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.772411) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.772415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.772418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:40:06.772421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4021541589' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.957 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.965 227317 DEBUG nova.compute.provider_tree [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:40:06 np0005596062 nova_compute[227313]: 2026-01-26 18:40:06.984 227317 DEBUG nova.scheduler.client.report [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:40:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.109 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.110 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.240 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.241 227317 DEBUG nova.network.neutron [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.266 227317 INFO nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.408 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.425 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.430 227317 DEBUG nova.policy [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffa1cd7ba9e543f78f2ef48c2a7a67a2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '301bad5c2066428fa7f214024672bf92', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 13:40:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:40:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:07.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.612 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.613 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.614 227317 INFO nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Creating image(s)#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.644 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.671 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.707 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.711 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.779 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.780 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.780 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.781 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.805 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:07 np0005596062 nova_compute[227313]: 2026-01-26 18:40:07.808 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 b81e40ad-cba8-4851-8245-5c3eb983b479_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:07.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.070 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 b81e40ad-cba8-4851-8245-5c3eb983b479_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.159 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] resizing rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.276 227317 DEBUG nova.objects.instance [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'migration_context' on Instance uuid b81e40ad-cba8-4851-8245-5c3eb983b479 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.296 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.296 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Ensure instance console log exists: /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.297 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.297 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.298 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:08 np0005596062 nova_compute[227313]: 2026-01-26 18:40:08.901 227317 DEBUG nova.network.neutron [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Successfully created port: 2e588806-3c53-401a-90f3-537e4176dcfe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 13:40:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:09.190 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:09.191 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:09.191 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:09.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:09.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.144 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:10.437 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.437 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:10.438 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.519 227317 DEBUG nova.network.neutron [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Successfully updated port: 2e588806-3c53-401a-90f3-537e4176dcfe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.534 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.534 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.534 227317 DEBUG nova.network.neutron [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.602 227317 DEBUG nova.compute.manager [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-changed-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.603 227317 DEBUG nova.compute.manager [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Refreshing instance network info cache due to event network-changed-2e588806-3c53-401a-90f3-537e4176dcfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.603 227317 DEBUG oslo_concurrency.lockutils [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:10 np0005596062 nova_compute[227313]: 2026-01-26 18:40:10.662 227317 DEBUG nova.network.neutron [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 13:40:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:11.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.511 227317 DEBUG nova.network.neutron [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.552 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.553 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Instance network_info: |[{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.553 227317 DEBUG oslo_concurrency.lockutils [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.553 227317 DEBUG nova.network.neutron [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Refreshing network info cache for port 2e588806-3c53-401a-90f3-537e4176dcfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.556 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Start _get_guest_xml network_info=[{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.560 227317 WARNING nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.565 227317 DEBUG nova.virt.libvirt.host [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.566 227317 DEBUG nova.virt.libvirt.host [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.571 227317 DEBUG nova.virt.libvirt.host [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.571 227317 DEBUG nova.virt.libvirt.host [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.573 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.573 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.574 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.574 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.574 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.575 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.575 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.575 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.575 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.575 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.576 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.576 227317 DEBUG nova.virt.hardware [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:40:11 np0005596062 nova_compute[227313]: 2026-01-26 18:40:11.579 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:11.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:40:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1478263716' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.034 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.063 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.067 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.410 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:40:12 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4226369101' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.531 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.534 227317 DEBUG nova.virt.libvirt.vif [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-766514569',display_name='tempest-TestNetworkAdvancedServerOps-server-766514569',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-766514569',id=27,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7uQVm9s7C+OqbAh1CIPBxJi+6AkyPpWOPYYV7DcXbtYqg7663H86MBmiolT3Uacef2LD9/V7P8RfgEuQwZCVENs2yHMAD4P9rcdlzFL0K8Hhq6UoTOylf5rcW9T4i1Qg==',key_name='tempest-TestNetworkAdvancedServerOps-706838647',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-rq7teih3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:40:07Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b81e40ad-cba8-4851-8245-5c3eb983b479,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.535 227317 DEBUG nova.network.os_vif_util [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.537 227317 DEBUG nova.network.os_vif_util [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.539 227317 DEBUG nova.objects.instance [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'pci_devices' on Instance uuid b81e40ad-cba8-4851-8245-5c3eb983b479 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.559 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <uuid>b81e40ad-cba8-4851-8245-5c3eb983b479</uuid>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <name>instance-0000001b</name>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-766514569</nova:name>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:40:11</nova:creationTime>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:user uuid="ffa1cd7ba9e543f78f2ef48c2a7a67a2">tempest-TestNetworkAdvancedServerOps-1357272614-project-member</nova:user>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:project uuid="301bad5c2066428fa7f214024672bf92">tempest-TestNetworkAdvancedServerOps-1357272614</nova:project>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <nova:port uuid="2e588806-3c53-401a-90f3-537e4176dcfe">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <entry name="serial">b81e40ad-cba8-4851-8245-5c3eb983b479</entry>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <entry name="uuid">b81e40ad-cba8-4851-8245-5c3eb983b479</entry>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/b81e40ad-cba8-4851-8245-5c3eb983b479_disk">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/b81e40ad-cba8-4851-8245-5c3eb983b479_disk.config">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:24:50:d1"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <target dev="tap2e588806-3c"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/console.log" append="off"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:40:12 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:40:12 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:40:12 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:40:12 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.561 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Preparing to wait for external event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.561 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.561 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.562 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.562 227317 DEBUG nova.virt.libvirt.vif [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-766514569',display_name='tempest-TestNetworkAdvancedServerOps-server-766514569',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-766514569',id=27,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7uQVm9s7C+OqbAh1CIPBxJi+6AkyPpWOPYYV7DcXbtYqg7663H86MBmiolT3Uacef2LD9/V7P8RfgEuQwZCVENs2yHMAD4P9rcdlzFL0K8Hhq6UoTOylf5rcW9T4i1Qg==',key_name='tempest-TestNetworkAdvancedServerOps-706838647',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-rq7teih3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:40:07Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b81e40ad-cba8-4851-8245-5c3eb983b479,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.562 227317 DEBUG nova.network.os_vif_util [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.563 227317 DEBUG nova.network.os_vif_util [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.564 227317 DEBUG os_vif [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.564 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.565 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.565 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.568 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.568 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e588806-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.569 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e588806-3c, col_values=(('external_ids', {'iface-id': '2e588806-3c53-401a-90f3-537e4176dcfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:50:d1', 'vm-uuid': 'b81e40ad-cba8-4851-8245-5c3eb983b479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.570 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:12 np0005596062 NetworkManager[48993]: <info>  [1769452812.5713] manager: (tap2e588806-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.573 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.579 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.580 227317 INFO os_vif [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c')#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.695 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.695 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.695 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] No VIF found with MAC fa:16:3e:24:50:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.696 227317 INFO nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Using config drive#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.723 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.789 227317 DEBUG nova.network.neutron [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updated VIF entry in instance network info cache for port 2e588806-3c53-401a-90f3-537e4176dcfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.790 227317 DEBUG nova.network.neutron [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:12 np0005596062 nova_compute[227313]: 2026-01-26 18:40:12.879 227317 DEBUG oslo_concurrency.lockutils [req-87266c4a-1d6e-40ed-8548-2f40a46be16a req-038dada6-8c87-424c-b77a-e78469a4f905 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.083 227317 INFO nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Creating config drive at /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/disk.config#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.091 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_zuhuqdj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.241 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_zuhuqdj" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.268 227317 DEBUG nova.storage.rbd_utils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] rbd image b81e40ad-cba8-4851-8245-5c3eb983b479_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.272 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/disk.config b81e40ad-cba8-4851-8245-5c3eb983b479_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.440 227317 DEBUG oslo_concurrency.processutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/disk.config b81e40ad-cba8-4851-8245-5c3eb983b479_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.441 227317 INFO nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Deleting local config drive /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/disk.config because it was imported into RBD.#033[00m
Jan 26 13:40:13 np0005596062 kernel: tap2e588806-3c: entered promiscuous mode
Jan 26 13:40:13 np0005596062 NetworkManager[48993]: <info>  [1769452813.4987] manager: (tap2e588806-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.498 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:13Z|00225|binding|INFO|Claiming lport 2e588806-3c53-401a-90f3-537e4176dcfe for this chassis.
Jan 26 13:40:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:13Z|00226|binding|INFO|2e588806-3c53-401a-90f3-537e4176dcfe: Claiming fa:16:3e:24:50:d1 10.100.0.7
Jan 26 13:40:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:13.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:13 np0005596062 systemd-machined[195380]: New machine qemu-21-instance-0000001b.
Jan 26 13:40:13 np0005596062 systemd[1]: Started Virtual Machine qemu-21-instance-0000001b.
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.578 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:13Z|00227|binding|INFO|Setting lport 2e588806-3c53-401a-90f3-537e4176dcfe ovn-installed in OVS
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.584 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:13 np0005596062 systemd-udevd[259778]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:40:13 np0005596062 NetworkManager[48993]: <info>  [1769452813.6112] device (tap2e588806-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:40:13 np0005596062 NetworkManager[48993]: <info>  [1769452813.6122] device (tap2e588806-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:40:13 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:13Z|00228|binding|INFO|Setting lport 2e588806-3c53-401a-90f3-537e4176dcfe up in Southbound
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.676 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:50:d1 10.100.0.7'], port_security=['fa:16:3e:24:50:d1 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b81e40ad-cba8-4851-8245-5c3eb983b479', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82e3f39f-8d87-4e62-a668-ee902f53c144', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff649c44-332a-4be4-82da-382a0117f640', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a7598a0-01e1-4002-824f-2c7bac3a3915, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=2e588806-3c53-401a-90f3-537e4176dcfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.677 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 2e588806-3c53-401a-90f3-537e4176dcfe in datapath 82e3f39f-8d87-4e62-a668-ee902f53c144 bound to our chassis#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.679 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82e3f39f-8d87-4e62-a668-ee902f53c144#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.691 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f975da12-3884-436f-9410-f87f7a312849]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.692 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82e3f39f-81 in ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.696 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82e3f39f-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.696 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4a89ec-7ece-4690-b7ea-573f88c99273]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.698 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4e71e524-bff0-4f97-9e07-e28ee88bde9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.713 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca4c35b-4ed5-49bc-958b-71d9282f8073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.738 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[61263b73-ce40-4366-a332-07fa9062c63f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.765 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[cda9d592-a801-44f8-8aea-fb397722b757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.770 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[95b39e7c-45da-4438-ae58-5c48d9e8b2b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 systemd-udevd[259780]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:40:13 np0005596062 NetworkManager[48993]: <info>  [1769452813.7723] manager: (tap82e3f39f-80): new Veth device (/org/freedesktop/NetworkManager/Devices/110)
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.808 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[c09f14b7-3d43-4be1-8879-823a7301fcc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.811 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd66d20-d10a-45a6-85b9-845e2f4e8065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 NetworkManager[48993]: <info>  [1769452813.8380] device (tap82e3f39f-80): carrier: link connected
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.845 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[09aae41c-52cc-427f-a3dd-5f4c17a7ae94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.866 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[908ef1ae-8851-4b27-8b51-aa81206c3935]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82e3f39f-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:76:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 657388, 'reachable_time': 26445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259811, 'error': None, 'target': 'ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.882 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[271f55c0-d456-461a-b53f-d0d33c8bbe76]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:7677'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 657388, 'tstamp': 657388}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259812, 'error': None, 'target': 'ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.905 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[f54c0db3-3569-43d8-8565-5028955bf8d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82e3f39f-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:76:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 657388, 'reachable_time': 26445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259813, 'error': None, 'target': 'ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.910 227317 DEBUG nova.compute.manager [req-9f9b578f-69df-42d2-a775-da7bf8257ddb req-2009e629-1d79-4f1d-b920-0c9b4b8c239a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.911 227317 DEBUG oslo_concurrency.lockutils [req-9f9b578f-69df-42d2-a775-da7bf8257ddb req-2009e629-1d79-4f1d-b920-0c9b4b8c239a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.911 227317 DEBUG oslo_concurrency.lockutils [req-9f9b578f-69df-42d2-a775-da7bf8257ddb req-2009e629-1d79-4f1d-b920-0c9b4b8c239a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.912 227317 DEBUG oslo_concurrency.lockutils [req-9f9b578f-69df-42d2-a775-da7bf8257ddb req-2009e629-1d79-4f1d-b920-0c9b4b8c239a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:13 np0005596062 nova_compute[227313]: 2026-01-26 18:40:13.912 227317 DEBUG nova.compute.manager [req-9f9b578f-69df-42d2-a775-da7bf8257ddb req-2009e629-1d79-4f1d-b920-0c9b4b8c239a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Processing event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:40:13 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:13.941 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[73220da2-9288-43d5-af95-2033acef7cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:13.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.000 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c5631f47-faf4-4a9f-a276-9594fd347bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.001 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82e3f39f-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.001 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.002 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82e3f39f-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:14 np0005596062 kernel: tap82e3f39f-80: entered promiscuous mode
Jan 26 13:40:14 np0005596062 NetworkManager[48993]: <info>  [1769452814.0044] manager: (tap82e3f39f-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.003 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.012 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82e3f39f-80, col_values=(('external_ids', {'iface-id': 'e9b59e49-0dfa-4e26-ac57-5b753f5687f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:14 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:14Z|00229|binding|INFO|Releasing lport e9b59e49-0dfa-4e26-ac57-5b753f5687f0 from this chassis (sb_readonly=0)
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.014 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.032 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82e3f39f-8d87-4e62-a668-ee902f53c144.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82e3f39f-8d87-4e62-a668-ee902f53c144.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.035 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.035 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[67a5072b-a679-4d28-bfdc-c2c5f7b8b813]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.037 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-82e3f39f-8d87-4e62-a668-ee902f53c144
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/82e3f39f-8d87-4e62-a668-ee902f53c144.pid.haproxy
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 82e3f39f-8d87-4e62-a668-ee902f53c144
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.038 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144', 'env', 'PROCESS_TAG=haproxy-82e3f39f-8d87-4e62-a668-ee902f53c144', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82e3f39f-8d87-4e62-a668-ee902f53c144.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.319 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.320 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452814.31856, b81e40ad-cba8-4851-8245-5c3eb983b479 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.320 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] VM Started (Lifecycle Event)#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.326 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.330 227317 INFO nova.virt.libvirt.driver [-] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Instance spawned successfully.#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.331 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.340 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.345 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.390 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.391 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452814.319838, b81e40ad-cba8-4851-8245-5c3eb983b479 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.391 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.394 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.394 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.395 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.395 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.396 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.396 227317 DEBUG nova.virt.libvirt.driver [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:40:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:14.439 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.441 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.447 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769452814.3257363, b81e40ad-cba8-4851-8245-5c3eb983b479 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.448 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.468 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.472 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.475 227317 INFO nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Took 6.86 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.475 227317 DEBUG nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:40:14 np0005596062 podman[259886]: 2026-01-26 18:40:14.499159013 +0000 UTC m=+0.070685167 container create 7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.501 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.536 227317 INFO nova.compute.manager [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Took 8.22 seconds to build instance.#033[00m
Jan 26 13:40:14 np0005596062 systemd[1]: Started libpod-conmon-7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b.scope.
Jan 26 13:40:14 np0005596062 nova_compute[227313]: 2026-01-26 18:40:14.553 227317 DEBUG oslo_concurrency.lockutils [None req-34427c68-60d0-4993-9c97-3ebdc08eb36c ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:14 np0005596062 podman[259886]: 2026-01-26 18:40:14.461268677 +0000 UTC m=+0.032794851 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:40:14 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:40:14 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb0742539045ef6f46ca85a2765f691d7a3ca78b86803ec97377b6777ee2feaf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:40:14 np0005596062 podman[259886]: 2026-01-26 18:40:14.595500838 +0000 UTC m=+0.167027012 container init 7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 13:40:14 np0005596062 podman[259886]: 2026-01-26 18:40:14.606650797 +0000 UTC m=+0.178176961 container start 7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:40:14 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [NOTICE]   (259905) : New worker (259907) forked
Jan 26 13:40:14 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [NOTICE]   (259905) : Loading success.
Jan 26 13:40:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:15.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:15.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.077 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.078 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.078 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.078 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.078 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.177 227317 DEBUG nova.compute.manager [req-12ff4953-d597-46aa-846f-4cd50af36b47 req-da8144f2-cb01-459c-bc9b-066125f0bf19 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.178 227317 DEBUG oslo_concurrency.lockutils [req-12ff4953-d597-46aa-846f-4cd50af36b47 req-da8144f2-cb01-459c-bc9b-066125f0bf19 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.178 227317 DEBUG oslo_concurrency.lockutils [req-12ff4953-d597-46aa-846f-4cd50af36b47 req-da8144f2-cb01-459c-bc9b-066125f0bf19 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.178 227317 DEBUG oslo_concurrency.lockutils [req-12ff4953-d597-46aa-846f-4cd50af36b47 req-da8144f2-cb01-459c-bc9b-066125f0bf19 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.179 227317 DEBUG nova.compute.manager [req-12ff4953-d597-46aa-846f-4cd50af36b47 req-da8144f2-cb01-459c-bc9b-066125f0bf19 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] No waiting events found dispatching network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.179 227317 WARNING nova.compute.manager [req-12ff4953-d597-46aa-846f-4cd50af36b47 req-da8144f2-cb01-459c-bc9b-066125f0bf19 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received unexpected event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe for instance with vm_state active and task_state None.#033[00m
Jan 26 13:40:16 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:16Z|00230|binding|INFO|Releasing lport e9b59e49-0dfa-4e26-ac57-5b753f5687f0 from this chassis (sb_readonly=0)
Jan 26 13:40:16 np0005596062 NetworkManager[48993]: <info>  [1769452816.3880] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 26 13:40:16 np0005596062 NetworkManager[48993]: <info>  [1769452816.3891] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.389 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:16 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:16Z|00231|binding|INFO|Releasing lport e9b59e49-0dfa-4e26-ac57-5b753f5687f0 from this chassis (sb_readonly=0)
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.420 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.425 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:40:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1101953315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.524 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.586 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000001b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.587 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000001b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.637 227317 DEBUG nova.compute.manager [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-changed-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.637 227317 DEBUG nova.compute.manager [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Refreshing instance network info cache due to event network-changed-2e588806-3c53-401a-90f3-537e4176dcfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.638 227317 DEBUG oslo_concurrency.lockutils [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.638 227317 DEBUG oslo_concurrency.lockutils [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.638 227317 DEBUG nova.network.neutron [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Refreshing network info cache for port 2e588806-3c53-401a-90f3-537e4176dcfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.724 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.725 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4503MB free_disk=20.967525482177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.726 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.726 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.836 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance b81e40ad-cba8-4851-8245-5c3eb983b479 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.837 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.837 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:40:16 np0005596062 nova_compute[227313]: 2026-01-26 18:40:16.878 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:40:17 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3383016004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.308 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.314 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.329 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.354 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.354 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.412 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:17 np0005596062 nova_compute[227313]: 2026-01-26 18:40:17.570 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:17.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:19.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:19.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:20 np0005596062 nova_compute[227313]: 2026-01-26 18:40:20.392 227317 DEBUG nova.network.neutron [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updated VIF entry in instance network info cache for port 2e588806-3c53-401a-90f3-537e4176dcfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:40:20 np0005596062 nova_compute[227313]: 2026-01-26 18:40:20.392 227317 DEBUG nova.network.neutron [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:20 np0005596062 nova_compute[227313]: 2026-01-26 18:40:20.412 227317 DEBUG oslo_concurrency.lockutils [req-0f22e093-8b25-44e2-bf19-cc3999ab53ef req-db8b44eb-f288-404c-a3ee-09b13c84c9e8 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:20 np0005596062 podman[259965]: 2026-01-26 18:40:20.876233749 +0000 UTC m=+0.082931096 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 13:40:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:21.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:21.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:22 np0005596062 nova_compute[227313]: 2026-01-26 18:40:22.414 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:22 np0005596062 nova_compute[227313]: 2026-01-26 18:40:22.571 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:23.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:23.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:25.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:25.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.355 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.375 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.375 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.375 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.888 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.888 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.888 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:40:26 np0005596062 nova_compute[227313]: 2026-01-26 18:40:26.888 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b81e40ad-cba8-4851-8245-5c3eb983b479 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:40:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:27 np0005596062 nova_compute[227313]: 2026-01-26 18:40:27.415 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:27.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:27 np0005596062 nova_compute[227313]: 2026-01-26 18:40:27.574 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:27 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:27Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:50:d1 10.100.0.7
Jan 26 13:40:27 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:27Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:50:d1 10.100.0.7
Jan 26 13:40:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:27.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.554 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.628 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.629 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.629 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.629 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.629 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.629 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:28 np0005596062 nova_compute[227313]: 2026-01-26 18:40:28.630 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:40:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:29.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:29.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:31 np0005596062 nova_compute[227313]: 2026-01-26 18:40:31.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:31 np0005596062 nova_compute[227313]: 2026-01-26 18:40:31.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:31.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:31.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:32 np0005596062 nova_compute[227313]: 2026-01-26 18:40:32.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:40:32 np0005596062 nova_compute[227313]: 2026-01-26 18:40:32.436 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:32 np0005596062 nova_compute[227313]: 2026-01-26 18:40:32.576 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:32 np0005596062 podman[260043]: 2026-01-26 18:40:32.880918793 +0000 UTC m=+0.091711401 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 13:40:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:33.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:33 np0005596062 nova_compute[227313]: 2026-01-26 18:40:33.912 227317 INFO nova.compute.manager [None req-0c7b04be-55ee-4471-a3ff-eb8d287b4745 ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Get console output#033[00m
Jan 26 13:40:33 np0005596062 nova_compute[227313]: 2026-01-26 18:40:33.921 254751 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 13:40:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:33.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:35.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:35.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:37 np0005596062 nova_compute[227313]: 2026-01-26 18:40:37.438 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:37.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:37 np0005596062 nova_compute[227313]: 2026-01-26 18:40:37.577 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:37.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:39.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:39 np0005596062 nova_compute[227313]: 2026-01-26 18:40:39.827 227317 DEBUG oslo_concurrency.lockutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:39 np0005596062 nova_compute[227313]: 2026-01-26 18:40:39.827 227317 DEBUG oslo_concurrency.lockutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:39 np0005596062 nova_compute[227313]: 2026-01-26 18:40:39.827 227317 DEBUG nova.network.neutron [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:40:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:39.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:41.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:41 np0005596062 nova_compute[227313]: 2026-01-26 18:40:41.542 227317 DEBUG nova.network.neutron [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:41 np0005596062 nova_compute[227313]: 2026-01-26 18:40:41.558 227317 DEBUG oslo_concurrency.lockutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:41 np0005596062 nova_compute[227313]: 2026-01-26 18:40:41.632 227317 DEBUG nova.virt.libvirt.driver [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 26 13:40:41 np0005596062 nova_compute[227313]: 2026-01-26 18:40:41.633 227317 DEBUG nova.virt.libvirt.volume.remotefs [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Creating file /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/0be4b3f3721f44b0bf07fd33fcbd9010.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 26 13:40:41 np0005596062 nova_compute[227313]: 2026-01-26 18:40:41.633 227317 DEBUG oslo_concurrency.processutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/0be4b3f3721f44b0bf07fd33fcbd9010.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:41.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.068 227317 DEBUG oslo_concurrency.processutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/0be4b3f3721f44b0bf07fd33fcbd9010.tmp" returned: 1 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.068 227317 DEBUG oslo_concurrency.processutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479/0be4b3f3721f44b0bf07fd33fcbd9010.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.069 227317 DEBUG nova.virt.libvirt.volume.remotefs [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Creating directory /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.069 227317 DEBUG oslo_concurrency.processutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:40:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.271 227317 DEBUG oslo_concurrency.processutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/b81e40ad-cba8-4851-8245-5c3eb983b479" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.277 227317 DEBUG nova.virt.libvirt.driver [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.439 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:42 np0005596062 nova_compute[227313]: 2026-01-26 18:40:42.579 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:43.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:43.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:45 np0005596062 kernel: tap2e588806-3c (unregistering): left promiscuous mode
Jan 26 13:40:45 np0005596062 NetworkManager[48993]: <info>  [1769452845.3703] device (tap2e588806-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:40:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:45Z|00232|binding|INFO|Releasing lport 2e588806-3c53-401a-90f3-537e4176dcfe from this chassis (sb_readonly=0)
Jan 26 13:40:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:45Z|00233|binding|INFO|Setting lport 2e588806-3c53-401a-90f3-537e4176dcfe down in Southbound
Jan 26 13:40:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:40:45Z|00234|binding|INFO|Removing iface tap2e588806-3c ovn-installed in OVS
Jan 26 13:40:45 np0005596062 nova_compute[227313]: 2026-01-26 18:40:45.383 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:45 np0005596062 nova_compute[227313]: 2026-01-26 18:40:45.386 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:45 np0005596062 nova_compute[227313]: 2026-01-26 18:40:45.408 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:45 np0005596062 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 26 13:40:45 np0005596062 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Consumed 14.446s CPU time.
Jan 26 13:40:45 np0005596062 systemd-machined[195380]: Machine qemu-21-instance-0000001b terminated.
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.534 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:50:d1 10.100.0.7'], port_security=['fa:16:3e:24:50:d1 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b81e40ad-cba8-4851-8245-5c3eb983b479', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82e3f39f-8d87-4e62-a668-ee902f53c144', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '301bad5c2066428fa7f214024672bf92', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ff649c44-332a-4be4-82da-382a0117f640', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.213'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a7598a0-01e1-4002-824f-2c7bac3a3915, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=2e588806-3c53-401a-90f3-537e4176dcfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:40:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:45.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.536 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 2e588806-3c53-401a-90f3-537e4176dcfe in datapath 82e3f39f-8d87-4e62-a668-ee902f53c144 unbound from our chassis#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.538 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82e3f39f-8d87-4e62-a668-ee902f53c144, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.540 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[acc12535-9b30-475a-9839-c2952790d58d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.540 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144 namespace which is not needed anymore#033[00m
Jan 26 13:40:45 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [NOTICE]   (259905) : haproxy version is 2.8.14-c23fe91
Jan 26 13:40:45 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [NOTICE]   (259905) : path to executable is /usr/sbin/haproxy
Jan 26 13:40:45 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [WARNING]  (259905) : Exiting Master process...
Jan 26 13:40:45 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [ALERT]    (259905) : Current worker (259907) exited with code 143 (Terminated)
Jan 26 13:40:45 np0005596062 neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144[259901]: [WARNING]  (259905) : All workers exited. Exiting... (0)
Jan 26 13:40:45 np0005596062 systemd[1]: libpod-7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b.scope: Deactivated successfully.
Jan 26 13:40:45 np0005596062 podman[260162]: 2026-01-26 18:40:45.695566369 +0000 UTC m=+0.045253605 container died 7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:40:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b-userdata-shm.mount: Deactivated successfully.
Jan 26 13:40:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay-fb0742539045ef6f46ca85a2765f691d7a3ca78b86803ec97377b6777ee2feaf-merged.mount: Deactivated successfully.
Jan 26 13:40:45 np0005596062 podman[260162]: 2026-01-26 18:40:45.730973009 +0000 UTC m=+0.080660245 container cleanup 7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:40:45 np0005596062 systemd[1]: libpod-conmon-7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b.scope: Deactivated successfully.
Jan 26 13:40:45 np0005596062 podman[260189]: 2026-01-26 18:40:45.792096978 +0000 UTC m=+0.041692099 container remove 7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.797 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5b74ed-26fa-4261-b0eb-a8c5a29ee5c3]: (4, ('Mon Jan 26 06:40:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144 (7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b)\n7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b\nMon Jan 26 06:40:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144 (7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b)\n7c88f222d358b72d4f34babed9b23eac228abe02c1152a712ab94de26ac4a05b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.798 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[a58a47ba-0757-4ac7-83e9-dfb20bc28490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.799 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82e3f39f-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:45 np0005596062 kernel: tap82e3f39f-80: left promiscuous mode
Jan 26 13:40:45 np0005596062 nova_compute[227313]: 2026-01-26 18:40:45.840 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:45 np0005596062 nova_compute[227313]: 2026-01-26 18:40:45.862 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.864 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[91e7391c-ed38-49c5-80dd-c949b08eb486]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.887 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[5e84cbbf-ecd3-49d7-9dca-4f3ea2b98299]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.888 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[de9ce742-0677-457d-adce-6d542845bb8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.909 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[de8b0aea-958c-4cfe-a23e-53fbd818db2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 657381, 'reachable_time': 38736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260207, 'error': None, 'target': 'ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.911 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82e3f39f-8d87-4e62-a668-ee902f53c144 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 13:40:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:40:45.911 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[dc6e10d8-37cd-46b0-8719-060ccd423818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:40:45 np0005596062 systemd[1]: run-netns-ovnmeta\x2d82e3f39f\x2d8d87\x2d4e62\x2da668\x2dee902f53c144.mount: Deactivated successfully.
Jan 26 13:40:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:45.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.086 227317 DEBUG nova.compute.manager [req-35b8fb08-ad66-4629-8984-93930e480573 req-aaec5f7c-1c57-40e8-8447-ad06421f0c46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-vif-unplugged-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.087 227317 DEBUG oslo_concurrency.lockutils [req-35b8fb08-ad66-4629-8984-93930e480573 req-aaec5f7c-1c57-40e8-8447-ad06421f0c46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.087 227317 DEBUG oslo_concurrency.lockutils [req-35b8fb08-ad66-4629-8984-93930e480573 req-aaec5f7c-1c57-40e8-8447-ad06421f0c46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.087 227317 DEBUG oslo_concurrency.lockutils [req-35b8fb08-ad66-4629-8984-93930e480573 req-aaec5f7c-1c57-40e8-8447-ad06421f0c46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.088 227317 DEBUG nova.compute.manager [req-35b8fb08-ad66-4629-8984-93930e480573 req-aaec5f7c-1c57-40e8-8447-ad06421f0c46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] No waiting events found dispatching network-vif-unplugged-2e588806-3c53-401a-90f3-537e4176dcfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.088 227317 WARNING nova.compute.manager [req-35b8fb08-ad66-4629-8984-93930e480573 req-aaec5f7c-1c57-40e8-8447-ad06421f0c46 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received unexpected event network-vif-unplugged-2e588806-3c53-401a-90f3-537e4176dcfe for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.299 227317 INFO nova.virt.libvirt.driver [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Instance shutdown successfully after 4 seconds.#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.305 227317 INFO nova.virt.libvirt.driver [-] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Instance destroyed successfully.#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.306 227317 DEBUG nova.virt.libvirt.vif [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-766514569',display_name='tempest-TestNetworkAdvancedServerOps-server-766514569',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-766514569',id=27,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7uQVm9s7C+OqbAh1CIPBxJi+6AkyPpWOPYYV7DcXbtYqg7663H86MBmiolT3Uacef2LD9/V7P8RfgEuQwZCVENs2yHMAD4P9rcdlzFL0K8Hhq6UoTOylf5rcW9T4i1Qg==',key_name='tempest-TestNetworkAdvancedServerOps-706838647',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:40:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-rq7teih3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:40:37Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b81e40ad-cba8-4851-8245-5c3eb983b479,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1049565076", "vif_mac": "fa:16:3e:24:50:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.307 227317 DEBUG nova.network.os_vif_util [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1049565076", "vif_mac": "fa:16:3e:24:50:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.307 227317 DEBUG nova.network.os_vif_util [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.308 227317 DEBUG os_vif [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.309 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.310 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e588806-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.311 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.312 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.315 227317 INFO os_vif [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c')#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.320 227317 DEBUG nova.virt.libvirt.driver [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] skipping disk for instance-0000001b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.321 227317 DEBUG nova.virt.libvirt.driver [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] skipping disk for instance-0000001b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.445 227317 DEBUG neutronclient.v2_0.client [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 2e588806-3c53-401a-90f3-537e4176dcfe for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.575 227317 DEBUG oslo_concurrency.lockutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.575 227317 DEBUG oslo_concurrency.lockutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:46 np0005596062 nova_compute[227313]: 2026-01-26 18:40:46.576 227317 DEBUG oslo_concurrency.lockutils [None req-a12c5a84-e832-4e79-b03e-68979bdf7f4f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:47 np0005596062 nova_compute[227313]: 2026-01-26 18:40:47.441 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:47 np0005596062 nova_compute[227313]: 2026-01-26 18:40:47.506 227317 DEBUG nova.compute.manager [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-changed-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:47 np0005596062 nova_compute[227313]: 2026-01-26 18:40:47.506 227317 DEBUG nova.compute.manager [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Refreshing instance network info cache due to event network-changed-2e588806-3c53-401a-90f3-537e4176dcfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:40:47 np0005596062 nova_compute[227313]: 2026-01-26 18:40:47.506 227317 DEBUG oslo_concurrency.lockutils [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:47 np0005596062 nova_compute[227313]: 2026-01-26 18:40:47.507 227317 DEBUG oslo_concurrency.lockutils [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:47 np0005596062 nova_compute[227313]: 2026-01-26 18:40:47.507 227317 DEBUG nova.network.neutron [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Refreshing network info cache for port 2e588806-3c53-401a-90f3-537e4176dcfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:40:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:48.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:48 np0005596062 nova_compute[227313]: 2026-01-26 18:40:48.161 227317 DEBUG nova.compute.manager [req-5f1b7a57-5ac9-42ef-8ebd-813a4f243c22 req-9e2e0eb0-e415-45bf-9fb6-caa35b48e7f0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:48 np0005596062 nova_compute[227313]: 2026-01-26 18:40:48.162 227317 DEBUG oslo_concurrency.lockutils [req-5f1b7a57-5ac9-42ef-8ebd-813a4f243c22 req-9e2e0eb0-e415-45bf-9fb6-caa35b48e7f0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:48 np0005596062 nova_compute[227313]: 2026-01-26 18:40:48.162 227317 DEBUG oslo_concurrency.lockutils [req-5f1b7a57-5ac9-42ef-8ebd-813a4f243c22 req-9e2e0eb0-e415-45bf-9fb6-caa35b48e7f0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:48 np0005596062 nova_compute[227313]: 2026-01-26 18:40:48.162 227317 DEBUG oslo_concurrency.lockutils [req-5f1b7a57-5ac9-42ef-8ebd-813a4f243c22 req-9e2e0eb0-e415-45bf-9fb6-caa35b48e7f0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:48 np0005596062 nova_compute[227313]: 2026-01-26 18:40:48.163 227317 DEBUG nova.compute.manager [req-5f1b7a57-5ac9-42ef-8ebd-813a4f243c22 req-9e2e0eb0-e415-45bf-9fb6-caa35b48e7f0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] No waiting events found dispatching network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:40:48 np0005596062 nova_compute[227313]: 2026-01-26 18:40:48.163 227317 WARNING nova.compute.manager [req-5f1b7a57-5ac9-42ef-8ebd-813a4f243c22 req-9e2e0eb0-e415-45bf-9fb6-caa35b48e7f0 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received unexpected event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 26 13:40:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:40:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:40:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:40:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:40:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:40:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:49.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:49 np0005596062 nova_compute[227313]: 2026-01-26 18:40:49.730 227317 DEBUG nova.network.neutron [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updated VIF entry in instance network info cache for port 2e588806-3c53-401a-90f3-537e4176dcfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:40:49 np0005596062 nova_compute[227313]: 2026-01-26 18:40:49.731 227317 DEBUG nova.network.neutron [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:49 np0005596062 nova_compute[227313]: 2026-01-26 18:40:49.747 227317 DEBUG oslo_concurrency.lockutils [req-35c47ea9-5abc-469d-ae09-9e5c3a6dbfd9 req-b8309978-0415-42e9-927b-0a790469acaf 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:50.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 26 13:40:51 np0005596062 nova_compute[227313]: 2026-01-26 18:40:51.312 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:51.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:51 np0005596062 podman[260342]: 2026-01-26 18:40:51.899846799 +0000 UTC m=+0.092059841 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 26 13:40:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:52.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:52 np0005596062 nova_compute[227313]: 2026-01-26 18:40:52.443 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:53.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:54.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:54 np0005596062 nova_compute[227313]: 2026-01-26 18:40:54.585 227317 DEBUG nova.compute.manager [req-0210de89-6648-4a08-8373-04f7f0e7a53d req-e44f873c-5a8c-4818-97e3-fb75ec2c805a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:54 np0005596062 nova_compute[227313]: 2026-01-26 18:40:54.585 227317 DEBUG oslo_concurrency.lockutils [req-0210de89-6648-4a08-8373-04f7f0e7a53d req-e44f873c-5a8c-4818-97e3-fb75ec2c805a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:54 np0005596062 nova_compute[227313]: 2026-01-26 18:40:54.585 227317 DEBUG oslo_concurrency.lockutils [req-0210de89-6648-4a08-8373-04f7f0e7a53d req-e44f873c-5a8c-4818-97e3-fb75ec2c805a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:54 np0005596062 nova_compute[227313]: 2026-01-26 18:40:54.585 227317 DEBUG oslo_concurrency.lockutils [req-0210de89-6648-4a08-8373-04f7f0e7a53d req-e44f873c-5a8c-4818-97e3-fb75ec2c805a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:54 np0005596062 nova_compute[227313]: 2026-01-26 18:40:54.586 227317 DEBUG nova.compute.manager [req-0210de89-6648-4a08-8373-04f7f0e7a53d req-e44f873c-5a8c-4818-97e3-fb75ec2c805a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] No waiting events found dispatching network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:40:54 np0005596062 nova_compute[227313]: 2026-01-26 18:40:54.586 227317 WARNING nova.compute.manager [req-0210de89-6648-4a08-8373-04f7f0e7a53d req-e44f873c-5a8c-4818-97e3-fb75ec2c805a 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received unexpected event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe for instance with vm_state resized and task_state None.#033[00m
Jan 26 13:40:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:55.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:40:55 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:40:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:40:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:56.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.313 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.412 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.413 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.413 227317 DEBUG nova.compute.manager [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Going to confirm migration 6 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.693 227317 DEBUG nova.compute.manager [req-cab344ae-4ed4-4551-8558-cf0e1fdd54bf req-a3009b4b-4be1-4234-8803-a9af2250e89d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.694 227317 DEBUG oslo_concurrency.lockutils [req-cab344ae-4ed4-4551-8558-cf0e1fdd54bf req-a3009b4b-4be1-4234-8803-a9af2250e89d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.694 227317 DEBUG oslo_concurrency.lockutils [req-cab344ae-4ed4-4551-8558-cf0e1fdd54bf req-a3009b4b-4be1-4234-8803-a9af2250e89d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.694 227317 DEBUG oslo_concurrency.lockutils [req-cab344ae-4ed4-4551-8558-cf0e1fdd54bf req-a3009b4b-4be1-4234-8803-a9af2250e89d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.695 227317 DEBUG nova.compute.manager [req-cab344ae-4ed4-4551-8558-cf0e1fdd54bf req-a3009b4b-4be1-4234-8803-a9af2250e89d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] No waiting events found dispatching network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:40:56 np0005596062 nova_compute[227313]: 2026-01-26 18:40:56.695 227317 WARNING nova.compute.manager [req-cab344ae-4ed4-4551-8558-cf0e1fdd54bf req-a3009b4b-4be1-4234-8803-a9af2250e89d 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Received unexpected event network-vif-plugged-2e588806-3c53-401a-90f3-537e4176dcfe for instance with vm_state resized and task_state None.#033[00m
Jan 26 13:40:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:40:57 np0005596062 nova_compute[227313]: 2026-01-26 18:40:57.360 227317 DEBUG neutronclient.v2_0.client [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 2e588806-3c53-401a-90f3-537e4176dcfe for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 26 13:40:57 np0005596062 nova_compute[227313]: 2026-01-26 18:40:57.361 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:40:57 np0005596062 nova_compute[227313]: 2026-01-26 18:40:57.361 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquired lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:40:57 np0005596062 nova_compute[227313]: 2026-01-26 18:40:57.361 227317 DEBUG nova.network.neutron [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 13:40:57 np0005596062 nova_compute[227313]: 2026-01-26 18:40:57.361 227317 DEBUG nova.objects.instance [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'info_cache' on Instance uuid b81e40ad-cba8-4851-8245-5c3eb983b479 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:40:57 np0005596062 nova_compute[227313]: 2026-01-26 18:40:57.445 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:40:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:57.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:40:58.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:40:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:40:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:40:59.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:40:59 np0005596062 nova_compute[227313]: 2026-01-26 18:40:59.586 227317 DEBUG nova.network.neutron [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Updating instance_info_cache with network_info: [{"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:40:59 np0005596062 nova_compute[227313]: 2026-01-26 18:40:59.670 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Releasing lock "refresh_cache-b81e40ad-cba8-4851-8245-5c3eb983b479" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:40:59 np0005596062 nova_compute[227313]: 2026-01-26 18:40:59.671 227317 DEBUG nova.objects.instance [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lazy-loading 'migration_context' on Instance uuid b81e40ad-cba8-4851-8245-5c3eb983b479 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:40:59 np0005596062 nova_compute[227313]: 2026-01-26 18:40:59.921 227317 DEBUG nova.storage.rbd_utils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] removing snapshot(nova-resize) on rbd image(b81e40ad-cba8-4851-8245-5c3eb983b479_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 26 13:41:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:00.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.624 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769452845.6232562, b81e40ad-cba8-4851-8245-5c3eb983b479 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.626 227317 INFO nova.compute.manager [-] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.645 227317 DEBUG nova.compute.manager [None req-c1207a30-537d-48fe-b5b7-3f112a4a7530 - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.647 227317 DEBUG nova.compute.manager [None req-c1207a30-537d-48fe-b5b7-3f112a4a7530 - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.676 227317 INFO nova.compute.manager [None req-c1207a30-537d-48fe-b5b7-3f112a4a7530 - - - - - -] [instance: b81e40ad-cba8-4851-8245-5c3eb983b479] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 26 13:41:00 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.972 227317 DEBUG nova.virt.libvirt.vif [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:40:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-766514569',display_name='tempest-TestNetworkAdvancedServerOps-server-766514569',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-766514569',id=27,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7uQVm9s7C+OqbAh1CIPBxJi+6AkyPpWOPYYV7DcXbtYqg7663H86MBmiolT3Uacef2LD9/V7P8RfgEuQwZCVENs2yHMAD4P9rcdlzFL0K8Hhq6UoTOylf5rcW9T4i1Qg==',key_name='tempest-TestNetworkAdvancedServerOps-706838647',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:40:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='301bad5c2066428fa7f214024672bf92',ramdisk_id='',reservation_id='r-rq7teih3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1357272614',owner_user_name='tempest-TestNetworkAdvancedServerOps-1357272614-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:40:54Z,user_data=None,user_id='ffa1cd7ba9e543f78f2ef48c2a7a67a2',uuid=b81e40ad-cba8-4851-8245-5c3eb983b479,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.973 227317 DEBUG nova.network.os_vif_util [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converting VIF {"id": "2e588806-3c53-401a-90f3-537e4176dcfe", "address": "fa:16:3e:24:50:d1", "network": {"id": "82e3f39f-8d87-4e62-a668-ee902f53c144", "bridge": "br-int", "label": "tempest-network-smoke--1049565076", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "301bad5c2066428fa7f214024672bf92", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e588806-3c", "ovs_interfaceid": "2e588806-3c53-401a-90f3-537e4176dcfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.973 227317 DEBUG nova.network.os_vif_util [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.974 227317 DEBUG os_vif [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.975 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.975 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e588806-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.976 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.977 227317 INFO os_vif [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:50:d1,bridge_name='br-int',has_traffic_filtering=True,id=2e588806-3c53-401a-90f3-537e4176dcfe,network=Network(82e3f39f-8d87-4e62-a668-ee902f53c144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e588806-3c')#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.978 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:41:00 np0005596062 nova_compute[227313]: 2026-01-26 18:41:00.978 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:41:01 np0005596062 nova_compute[227313]: 2026-01-26 18:41:01.075 227317 DEBUG oslo_concurrency.processutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:41:01 np0005596062 nova_compute[227313]: 2026-01-26 18:41:01.315 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:41:01 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2364324357' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:41:01 np0005596062 nova_compute[227313]: 2026-01-26 18:41:01.499 227317 DEBUG oslo_concurrency.processutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:41:01 np0005596062 nova_compute[227313]: 2026-01-26 18:41:01.504 227317 DEBUG nova.compute.provider_tree [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:41:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:01.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:01 np0005596062 nova_compute[227313]: 2026-01-26 18:41:01.581 227317 DEBUG nova.scheduler.client.report [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:41:01 np0005596062 nova_compute[227313]: 2026-01-26 18:41:01.658 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:02.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:02 np0005596062 nova_compute[227313]: 2026-01-26 18:41:02.449 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:02 np0005596062 nova_compute[227313]: 2026-01-26 18:41:02.611 227317 INFO nova.scheduler.client.report [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Deleted allocation for migration cca79c36-9a99-47b8-a0b0-1908c615a3bc#033[00m
Jan 26 13:41:03 np0005596062 nova_compute[227313]: 2026-01-26 18:41:03.271 227317 DEBUG oslo_concurrency.lockutils [None req-19bc5e5e-e4c5-4a5a-91c9-fc7fe3b7525f ffa1cd7ba9e543f78f2ef48c2a7a67a2 301bad5c2066428fa7f214024672bf92 - - default default] Lock "b81e40ad-cba8-4851-8245-5c3eb983b479" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:03.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:03 np0005596062 podman[260526]: 2026-01-26 18:41:03.899662162 +0000 UTC m=+0.102729407 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 13:41:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:04.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:05.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:06.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:06 np0005596062 nova_compute[227313]: 2026-01-26 18:41:06.318 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:06 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 26 13:41:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:07 np0005596062 nova_compute[227313]: 2026-01-26 18:41:07.452 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:07.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:08.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:41:09.191 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:41:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:41:09.192 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:41:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:41:09.193 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:09.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:10.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:11 np0005596062 nova_compute[227313]: 2026-01-26 18:41:11.320 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:11.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:12.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:12 np0005596062 nova_compute[227313]: 2026-01-26 18:41:12.453 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:14.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:41:14.691 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:41:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:41:14.692 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:41:14 np0005596062 nova_compute[227313]: 2026-01-26 18:41:14.691 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:15 np0005596062 ovn_controller[133984]: 2026-01-26T18:41:15Z|00235|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 13:41:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:15.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:16.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.080 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.081 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.082 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.322 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:41:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4237528513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.533 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.695 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.697 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4689MB free_disk=20.94265365600586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.698 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.698 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.769 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.770 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:41:16 np0005596062 nova_compute[227313]: 2026-01-26 18:41:16.818 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:41:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:41:17 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1743788299' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:41:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:17 np0005596062 nova_compute[227313]: 2026-01-26 18:41:17.263 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:41:17 np0005596062 nova_compute[227313]: 2026-01-26 18:41:17.269 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:41:17 np0005596062 nova_compute[227313]: 2026-01-26 18:41:17.286 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:41:17 np0005596062 nova_compute[227313]: 2026-01-26 18:41:17.310 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:41:17 np0005596062 nova_compute[227313]: 2026-01-26 18:41:17.311 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:17 np0005596062 nova_compute[227313]: 2026-01-26 18:41:17.456 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:17.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:18.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:19.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:20.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:21 np0005596062 nova_compute[227313]: 2026-01-26 18:41:21.325 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:21.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:21 np0005596062 nova_compute[227313]: 2026-01-26 18:41:21.618 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:21 np0005596062 nova_compute[227313]: 2026-01-26 18:41:21.695 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:22.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:22 np0005596062 nova_compute[227313]: 2026-01-26 18:41:22.457 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:22 np0005596062 podman[260631]: 2026-01-26 18:41:22.858425992 +0000 UTC m=+0.060134394 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 13:41:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:23.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:24.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:24 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:41:24.693 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:41:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:25.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:26.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:26 np0005596062 nova_compute[227313]: 2026-01-26 18:41:26.327 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:27 np0005596062 nova_compute[227313]: 2026-01-26 18:41:27.459 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:27.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:28.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:28 np0005596062 nova_compute[227313]: 2026-01-26 18:41:28.311 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:28 np0005596062 nova_compute[227313]: 2026-01-26 18:41:28.312 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:41:28 np0005596062 nova_compute[227313]: 2026-01-26 18:41:28.312 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:41:28 np0005596062 nova_compute[227313]: 2026-01-26 18:41:28.338 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:41:28 np0005596062 nova_compute[227313]: 2026-01-26 18:41:28.338 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:28 np0005596062 nova_compute[227313]: 2026-01-26 18:41:28.339 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:41:29 np0005596062 nova_compute[227313]: 2026-01-26 18:41:29.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:29 np0005596062 nova_compute[227313]: 2026-01-26 18:41:29.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:29.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:30 np0005596062 nova_compute[227313]: 2026-01-26 18:41:30.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:30 np0005596062 nova_compute[227313]: 2026-01-26 18:41:30.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:30.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:31 np0005596062 nova_compute[227313]: 2026-01-26 18:41:31.348 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:31.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:32.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:32 np0005596062 nova_compute[227313]: 2026-01-26 18:41:32.461 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:33 np0005596062 nova_compute[227313]: 2026-01-26 18:41:33.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:33 np0005596062 nova_compute[227313]: 2026-01-26 18:41:33.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:33.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:34.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:34 np0005596062 podman[260685]: 2026-01-26 18:41:34.926647751 +0000 UTC m=+0.124900512 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:41:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:35.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:36.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:36 np0005596062 nova_compute[227313]: 2026-01-26 18:41:36.348 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:37 np0005596062 nova_compute[227313]: 2026-01-26 18:41:37.462 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:37.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:38.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:39.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:40.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:41 np0005596062 nova_compute[227313]: 2026-01-26 18:41:41.351 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:41.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:42.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:42 np0005596062 nova_compute[227313]: 2026-01-26 18:41:42.463 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:43.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:41:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 9687 writes, 48K keys, 9687 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 9687 writes, 9687 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1575 writes, 7458 keys, 1575 commit groups, 1.0 writes per commit group, ingest: 16.11 MB, 0.03 MB/s#012Interval WAL: 1576 writes, 1576 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     64.4      0.93              0.20        28    0.033       0      0       0.0       0.0#012  L6      1/0    8.77 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.0     90.4     74.5      3.22              0.74        27    0.119    155K    15K       0.0       0.0#012 Sum      1/0    8.77 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.0     70.1     72.2      4.15              0.94        55    0.075    155K    15K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.5     95.6     95.1      0.62              0.17        10    0.062     35K   2623       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0     90.4     74.5      3.22              0.74        27    0.119    155K    15K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     64.5      0.93              0.20        27    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.058, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.29 GB write, 0.08 MB/s write, 0.28 GB read, 0.08 MB/s read, 4.1 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 33.99 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000223 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1954,32.83 MB,10.7997%) FilterBlock(55,441.42 KB,0.141801%) IndexBlock(55,750.42 KB,0.241064%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 13:41:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:44.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:45.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:46.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:46 np0005596062 nova_compute[227313]: 2026-01-26 18:41:46.415 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:47 np0005596062 nova_compute[227313]: 2026-01-26 18:41:47.493 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:47.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.051 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.054 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.055 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.055 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.056 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:41:48 np0005596062 nova_compute[227313]: 2026-01-26 18:41:48.056 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:41:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:48.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:49.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:50.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.562 227317 DEBUG nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.563 227317 WARNING nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.563 227317 WARNING nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.563 227317 INFO nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Removable base files: /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.563 227317 INFO nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.563 227317 INFO nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/845aad0744c07ae3a06850747475706fc56a381e#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.564 227317 DEBUG nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.564 227317 DEBUG nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 26 13:41:50 np0005596062 nova_compute[227313]: 2026-01-26 18:41:50.564 227317 DEBUG nova.virt.libvirt.imagecache [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 26 13:41:51 np0005596062 nova_compute[227313]: 2026-01-26 18:41:51.416 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:51.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:52.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:52 np0005596062 nova_compute[227313]: 2026-01-26 18:41:52.494 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:53.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:53 np0005596062 podman[260770]: 2026-01-26 18:41:53.858595071 +0000 UTC m=+0.064632884 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:41:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:54.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:55.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:41:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:56.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:41:56 np0005596062 nova_compute[227313]: 2026-01-26 18:41:56.418 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:57 np0005596062 nova_compute[227313]: 2026-01-26 18:41:57.497 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:41:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:41:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:57.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:41:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:41:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:41:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:41:58.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:41:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:41:58 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:41:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:41:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:41:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:41:59.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:00.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:42:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:42:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:42:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:42:00 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:42:01 np0005596062 nova_compute[227313]: 2026-01-26 18:42:01.463 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:01.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:02.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:02 np0005596062 nova_compute[227313]: 2026-01-26 18:42:02.521 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:03.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:04.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:05.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:05 np0005596062 podman[261095]: 2026-01-26 18:42:05.962174109 +0000 UTC m=+0.170355251 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:42:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:06.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:06 np0005596062 nova_compute[227313]: 2026-01-26 18:42:06.465 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:42:06 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:42:07 np0005596062 nova_compute[227313]: 2026-01-26 18:42:07.522 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:07.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:42:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:08.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:42:09 np0005596062 ovn_controller[133984]: 2026-01-26T18:42:09Z|00236|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 13:42:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:42:09.192 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:42:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:42:09.193 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:42:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:42:09.193 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:42:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:09.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:10.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:11 np0005596062 nova_compute[227313]: 2026-01-26 18:42:11.467 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:12.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:12 np0005596062 nova_compute[227313]: 2026-01-26 18:42:12.523 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:13.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:14.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:15.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:42:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 18K writes, 67K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.02 MB/s#012Cumulative WAL: 18K writes, 6126 syncs, 2.97 writes per sync, written: 0.06 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2834 writes, 10K keys, 2834 commit groups, 1.0 writes per commit group, ingest: 11.09 MB, 0.02 MB/s#012Interval WAL: 2834 writes, 1118 syncs, 2.53 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:42:16 np0005596062 nova_compute[227313]: 2026-01-26 18:42:16.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:42:16 np0005596062 nova_compute[227313]: 2026-01-26 18:42:16.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:42:16 np0005596062 nova_compute[227313]: 2026-01-26 18:42:16.076 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:42:16 np0005596062 nova_compute[227313]: 2026-01-26 18:42:16.093 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:16 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:42:16.093 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:42:16 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:42:16.095 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:42:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:16.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:16 np0005596062 ceph-mgr[77538]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2716354406
Jan 26 13:42:16 np0005596062 nova_compute[227313]: 2026-01-26 18:42:16.470 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:17 np0005596062 nova_compute[227313]: 2026-01-26 18:42:17.525 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:17.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:42:18 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:42:18.098 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:42:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:42:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:18.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.176 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.177 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.177 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.177 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.177 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:42:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:42:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3083597852' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.657 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.860 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.862 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4718MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.863 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:42:18 np0005596062 nova_compute[227313]: 2026-01-26 18:42:18.863 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.309 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.310 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.342 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:42:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:19.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:42:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2171686972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.794 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.798 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.946 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.947 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.948 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.948 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:42:19 np0005596062 nova_compute[227313]: 2026-01-26 18:42:19.949 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:42:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:42:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:20.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:42:21 np0005596062 nova_compute[227313]: 2026-01-26 18:42:21.472 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:42:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:21.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:42:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:22.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:42:22 np0005596062 nova_compute[227313]: 2026-01-26 18:42:22.527 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:23.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:24.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:24 np0005596062 podman[261276]: 2026-01-26 18:42:24.852212317 +0000 UTC m=+0.059612111 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 13:42:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:25.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:26.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:26 np0005596062 nova_compute[227313]: 2026-01-26 18:42:26.474 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:27 np0005596062 nova_compute[227313]: 2026-01-26 18:42:27.528 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:28.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:28 np0005596062 nova_compute[227313]: 2026-01-26 18:42:28.964 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:28 np0005596062 nova_compute[227313]: 2026-01-26 18:42:28.997 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:28 np0005596062 nova_compute[227313]: 2026-01-26 18:42:28.998 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 13:42:28 np0005596062 nova_compute[227313]: 2026-01-26 18:42:28.998 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 13:42:29 np0005596062 nova_compute[227313]: 2026-01-26 18:42:29.112 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 13:42:29 np0005596062 nova_compute[227313]: 2026-01-26 18:42:29.113 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:29 np0005596062 nova_compute[227313]: 2026-01-26 18:42:29.113 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:29 np0005596062 nova_compute[227313]: 2026-01-26 18:42:29.113 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:42:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:29.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:30 np0005596062 nova_compute[227313]: 2026-01-26 18:42:30.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:30 np0005596062 nova_compute[227313]: 2026-01-26 18:42:30.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:30 np0005596062 nova_compute[227313]: 2026-01-26 18:42:30.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:30.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:31 np0005596062 nova_compute[227313]: 2026-01-26 18:42:31.475 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:31.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:32.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:32 np0005596062 nova_compute[227313]: 2026-01-26 18:42:32.530 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:33 np0005596062 nova_compute[227313]: 2026-01-26 18:42:33.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:33 np0005596062 nova_compute[227313]: 2026-01-26 18:42:33.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:33.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:34.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:35.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:36.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:36 np0005596062 nova_compute[227313]: 2026-01-26 18:42:36.478 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:36 np0005596062 podman[261301]: 2026-01-26 18:42:36.90085562 +0000 UTC m=+0.112823356 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 13:42:37 np0005596062 nova_compute[227313]: 2026-01-26 18:42:37.533 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:37.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:38.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:39.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:40.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:41 np0005596062 nova_compute[227313]: 2026-01-26 18:42:41.480 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:41.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:42.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:42 np0005596062 nova_compute[227313]: 2026-01-26 18:42:42.534 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:43 np0005596062 nova_compute[227313]: 2026-01-26 18:42:43.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:42:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:43.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:44.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:45.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:46.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:46 np0005596062 nova_compute[227313]: 2026-01-26 18:42:46.480 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:47 np0005596062 nova_compute[227313]: 2026-01-26 18:42:47.536 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:47.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:48.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:49.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:50.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:51 np0005596062 nova_compute[227313]: 2026-01-26 18:42:51.482 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:51.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:52.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:52 np0005596062 nova_compute[227313]: 2026-01-26 18:42:52.538 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:53.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:42:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:54.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:42:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:55.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:55 np0005596062 podman[261388]: 2026-01-26 18:42:55.895039051 +0000 UTC m=+0.098534994 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 13:42:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:56.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:56 np0005596062 nova_compute[227313]: 2026-01-26 18:42:56.484 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:57 np0005596062 nova_compute[227313]: 2026-01-26 18:42:57.539 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:42:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:57.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:42:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:42:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:42:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:42:58.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:42:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:42:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:42:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:42:59.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:00.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:00 np0005596062 nova_compute[227313]: 2026-01-26 18:43:00.607 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:01 np0005596062 nova_compute[227313]: 2026-01-26 18:43:01.487 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.003000081s ======
Jan 26 13:43:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:01.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Jan 26 13:43:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:02.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:02 np0005596062 nova_compute[227313]: 2026-01-26 18:43:02.542 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:03.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:04.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.301650) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452985301740, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 2113, "num_deletes": 252, "total_data_size": 5031061, "memory_usage": 5104472, "flush_reason": "Manual Compaction"}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452985329458, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 3276555, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 47368, "largest_seqno": 49476, "table_properties": {"data_size": 3268000, "index_size": 5241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 17798, "raw_average_key_size": 20, "raw_value_size": 3250882, "raw_average_value_size": 3715, "num_data_blocks": 228, "num_entries": 875, "num_filter_entries": 875, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452807, "oldest_key_time": 1769452807, "file_creation_time": 1769452985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 28016 microseconds, and 14086 cpu microseconds.
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.329661) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 3276555 bytes OK
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.329743) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.332246) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.332273) EVENT_LOG_v1 {"time_micros": 1769452985332265, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.332297) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 5021790, prev total WAL file size 5021790, number of live WAL files 2.
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.334729) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(3199KB)], [93(8979KB)]
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452985334831, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 12471705, "oldest_snapshot_seqno": -1}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7110 keys, 10505160 bytes, temperature: kUnknown
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452985426474, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 10505160, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10459870, "index_size": 26426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17797, "raw_key_size": 183098, "raw_average_key_size": 25, "raw_value_size": 10334433, "raw_average_value_size": 1453, "num_data_blocks": 1049, "num_entries": 7110, "num_filter_entries": 7110, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.426844) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 10505160 bytes
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.428553) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.9 rd, 114.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 8.8 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(7.0) write-amplify(3.2) OK, records in: 7635, records dropped: 525 output_compression: NoCompression
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.428575) EVENT_LOG_v1 {"time_micros": 1769452985428565, "job": 58, "event": "compaction_finished", "compaction_time_micros": 91779, "compaction_time_cpu_micros": 51076, "output_level": 6, "num_output_files": 1, "total_output_size": 10505160, "num_input_records": 7635, "num_output_records": 7110, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452985429459, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452985431374, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.334535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.431440) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.431445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.431447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.431449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:05 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:05.431451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 13:43:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:05.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 13:43:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:06.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:06 np0005596062 nova_compute[227313]: 2026-01-26 18:43:06.488 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:06 np0005596062 podman[261638]: 2026-01-26 18:43:06.979531038 +0000 UTC m=+0.085044572 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 13:43:07 np0005596062 podman[261638]: 2026-01-26 18:43:07.09401152 +0000 UTC m=+0.199525044 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 26 13:43:07 np0005596062 podman[261674]: 2026-01-26 18:43:07.282884477 +0000 UTC m=+0.125060126 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 13:43:07 np0005596062 nova_compute[227313]: 2026-01-26 18:43:07.543 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:07.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:07 np0005596062 podman[261818]: 2026-01-26 18:43:07.826212923 +0000 UTC m=+0.074617813 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:43:07 np0005596062 podman[261818]: 2026-01-26 18:43:07.837886097 +0000 UTC m=+0.086290987 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:43:08 np0005596062 podman[261882]: 2026-01-26 18:43:08.031548312 +0000 UTC m=+0.055828699 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, architecture=x86_64, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4)
Jan 26 13:43:08 np0005596062 podman[261882]: 2026-01-26 18:43:08.042502406 +0000 UTC m=+0.066782743 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=keepalived-container, vcs-type=git, name=keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.28.2)
Jan 26 13:43:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:08.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:08 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:43:09.193 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:43:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:43:09.194 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:43:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:43:09.194 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:43:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:43:09.211 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:43:09 np0005596062 nova_compute[227313]: 2026-01-26 18:43:09.212 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:43:09.212 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:43:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:09.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.765424008 +0000 UTC m=+0.071138519 container create f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_swartz, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 26 13:43:09 np0005596062 systemd[1]: Started libpod-conmon-f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c.scope.
Jan 26 13:43:09 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.739070411 +0000 UTC m=+0.044785012 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.854466557 +0000 UTC m=+0.160181118 container init f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_swartz, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.861674321 +0000 UTC m=+0.167388872 container start f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_swartz, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.866375977 +0000 UTC m=+0.172090578 container attach f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_swartz, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 13:43:09 np0005596062 happy_swartz[262206]: 167 167
Jan 26 13:43:09 np0005596062 systemd[1]: libpod-f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c.scope: Deactivated successfully.
Jan 26 13:43:09 np0005596062 conmon[262206]: conmon f99291aeaeac49809b41 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c.scope/container/memory.events
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.869290015 +0000 UTC m=+0.175004556 container died f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_swartz, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 26 13:43:09 np0005596062 systemd[1]: var-lib-containers-storage-overlay-0e804a8c17e4eced13e1912a0deb46b4b711594b3dbd5a03ca8abb7d094471a6-merged.mount: Deactivated successfully.
Jan 26 13:43:09 np0005596062 podman[262190]: 2026-01-26 18:43:09.922243516 +0000 UTC m=+0.227958077 container remove f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=happy_swartz, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 26 13:43:09 np0005596062 systemd[1]: libpod-conmon-f99291aeaeac49809b4140b0b0a899b57cd0b4a149e46d1a639ccc1d9153057c.scope: Deactivated successfully.
Jan 26 13:43:10 np0005596062 podman[262229]: 2026-01-26 18:43:10.137061439 +0000 UTC m=+0.055629473 container create 6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_kapitsa, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 13:43:10 np0005596062 systemd[1]: Started libpod-conmon-6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7.scope.
Jan 26 13:43:10 np0005596062 podman[262229]: 2026-01-26 18:43:10.108283537 +0000 UTC m=+0.026851651 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 26 13:43:10 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:43:10 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8be57034b0bc0c12414275967ce0ff490a753b3711368ad91a82c71d3165c5a1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 13:43:10 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8be57034b0bc0c12414275967ce0ff490a753b3711368ad91a82c71d3165c5a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 13:43:10 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8be57034b0bc0c12414275967ce0ff490a753b3711368ad91a82c71d3165c5a1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 13:43:10 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8be57034b0bc0c12414275967ce0ff490a753b3711368ad91a82c71d3165c5a1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 13:43:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:10.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:10 np0005596062 podman[262229]: 2026-01-26 18:43:10.23438096 +0000 UTC m=+0.152949134 container init 6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_kapitsa, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 26 13:43:10 np0005596062 podman[262229]: 2026-01-26 18:43:10.243100454 +0000 UTC m=+0.161668488 container start 6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 26 13:43:10 np0005596062 podman[262229]: 2026-01-26 18:43:10.247510802 +0000 UTC m=+0.166078856 container attach 6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_kapitsa, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]: [
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:    {
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "available": false,
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "ceph_device": false,
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "lsm_data": {},
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "lvs": [],
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "path": "/dev/sr0",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "rejected_reasons": [
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "Has a FileSystem",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "Insufficient space (<5GB)"
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        ],
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        "sys_api": {
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "actuators": null,
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "device_nodes": "sr0",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "devname": "sr0",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "human_readable_size": "482.00 KB",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "id_bus": "ata",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "model": "QEMU DVD-ROM",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "nr_requests": "2",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "parent": "/dev/sr0",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "partitions": {},
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "path": "/dev/sr0",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "removable": "1",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "rev": "2.5+",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "ro": "0",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "rotational": "1",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "sas_address": "",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "sas_device_handle": "",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "scheduler_mode": "mq-deadline",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "sectors": 0,
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "sectorsize": "2048",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "size": 493568.0,
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "support_discard": "2048",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "type": "disk",
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:            "vendor": "QEMU"
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:        }
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]:    }
Jan 26 13:43:11 np0005596062 elastic_kapitsa[262246]: ]
Jan 26 13:43:11 np0005596062 systemd[1]: libpod-6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7.scope: Deactivated successfully.
Jan 26 13:43:11 np0005596062 systemd[1]: libpod-6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7.scope: Consumed 1.154s CPU time.
Jan 26 13:43:11 np0005596062 podman[262229]: 2026-01-26 18:43:11.420096531 +0000 UTC m=+1.338664555 container died 6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True)
Jan 26 13:43:11 np0005596062 systemd[1]: var-lib-containers-storage-overlay-8be57034b0bc0c12414275967ce0ff490a753b3711368ad91a82c71d3165c5a1-merged.mount: Deactivated successfully.
Jan 26 13:43:11 np0005596062 podman[262229]: 2026-01-26 18:43:11.483470001 +0000 UTC m=+1.402038025 container remove 6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 13:43:11 np0005596062 systemd[1]: libpod-conmon-6c27ed3a03729d4952adfc67dcae2986967120e55dcd22d5161d62ac2e4fb1c7.scope: Deactivated successfully.
Jan 26 13:43:11 np0005596062 nova_compute[227313]: 2026-01-26 18:43:11.490 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:11.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:12.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:43:12 np0005596062 nova_compute[227313]: 2026-01-26 18:43:12.544 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:13.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:14 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:43:14.214 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:43:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:14.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:15.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:16.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:16 np0005596062 nova_compute[227313]: 2026-01-26 18:43:16.495 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.732891) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452996732952, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 403, "num_deletes": 257, "total_data_size": 424515, "memory_usage": 433128, "flush_reason": "Manual Compaction"}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452996737750, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 269128, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49481, "largest_seqno": 49879, "table_properties": {"data_size": 266758, "index_size": 470, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5817, "raw_average_key_size": 18, "raw_value_size": 261945, "raw_average_value_size": 823, "num_data_blocks": 20, "num_entries": 318, "num_filter_entries": 318, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452986, "oldest_key_time": 1769452986, "file_creation_time": 1769452996, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 4907 microseconds, and 2424 cpu microseconds.
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.737800) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 269128 bytes OK
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.737821) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.740064) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.740087) EVENT_LOG_v1 {"time_micros": 1769452996740080, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.740105) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 421869, prev total WAL file size 421869, number of live WAL files 2.
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.740583) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353038' seq:72057594037927935, type:22 .. '6C6F676D0031373631' seq:0, type:0; will stop at (end)
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(262KB)], [96(10MB)]
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452996740657, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 10774288, "oldest_snapshot_seqno": -1}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 6902 keys, 10648933 bytes, temperature: kUnknown
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452996850886, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 10648933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10604287, "index_size": 26294, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17285, "raw_key_size": 179726, "raw_average_key_size": 26, "raw_value_size": 10481736, "raw_average_value_size": 1518, "num_data_blocks": 1041, "num_entries": 6902, "num_filter_entries": 6902, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769452996, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.851125) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 10648933 bytes
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.852932) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 97.7 rd, 96.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 10.0 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(79.6) write-amplify(39.6) OK, records in: 7428, records dropped: 526 output_compression: NoCompression
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.852949) EVENT_LOG_v1 {"time_micros": 1769452996852941, "job": 60, "event": "compaction_finished", "compaction_time_micros": 110305, "compaction_time_cpu_micros": 55942, "output_level": 6, "num_output_files": 1, "total_output_size": 10648933, "num_input_records": 7428, "num_output_records": 6902, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452996853191, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769452996855092, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.740500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.855192) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.855198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.855201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.855204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:16 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:43:16.855206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:43:17 np0005596062 nova_compute[227313]: 2026-01-26 18:43:17.546 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:17.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:43:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:18.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.073 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.073 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.074 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.074 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.074 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:43:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:43:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1813579769' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.533 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:43:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:19.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.725 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.727 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4722MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.727 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.728 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.974 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:43:19 np0005596062 nova_compute[227313]: 2026-01-26 18:43:19.975 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.030 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.093 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.094 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.109 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.130 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.147 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:43:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:20.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:43:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2020118190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.638 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.644 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.774 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.777 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:43:20 np0005596062 nova_compute[227313]: 2026-01-26 18:43:20.778 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:43:21 np0005596062 nova_compute[227313]: 2026-01-26 18:43:21.496 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:21.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:22.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:22 np0005596062 nova_compute[227313]: 2026-01-26 18:43:22.548 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:23.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:24.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:25.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:43:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:26.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:43:26 np0005596062 nova_compute[227313]: 2026-01-26 18:43:26.499 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:26 np0005596062 podman[263574]: 2026-01-26 18:43:26.879441386 +0000 UTC m=+0.081383105 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:43:27 np0005596062 nova_compute[227313]: 2026-01-26 18:43:27.552 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:27.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:28.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:29.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:30.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.781 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.781 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.781 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.809 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.810 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.810 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.810 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.810 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:30 np0005596062 nova_compute[227313]: 2026-01-26 18:43:30.810 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:43:31 np0005596062 nova_compute[227313]: 2026-01-26 18:43:31.501 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:31.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:32 np0005596062 nova_compute[227313]: 2026-01-26 18:43:32.074 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:32.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:32 np0005596062 nova_compute[227313]: 2026-01-26 18:43:32.552 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:33.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:34 np0005596062 nova_compute[227313]: 2026-01-26 18:43:34.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:34.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:35 np0005596062 nova_compute[227313]: 2026-01-26 18:43:35.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:43:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:35.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:36.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:36 np0005596062 nova_compute[227313]: 2026-01-26 18:43:36.503 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:37 np0005596062 nova_compute[227313]: 2026-01-26 18:43:37.555 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:37.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:37 np0005596062 podman[263601]: 2026-01-26 18:43:37.886517306 +0000 UTC m=+0.101667969 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 13:43:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:43:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:38.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:43:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:39.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:40.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:43:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/577336087' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:43:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:43:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/577336087' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:43:41 np0005596062 nova_compute[227313]: 2026-01-26 18:43:41.505 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:41.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:42.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:42 np0005596062 nova_compute[227313]: 2026-01-26 18:43:42.557 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:43.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:44.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:45.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:46.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:46 np0005596062 nova_compute[227313]: 2026-01-26 18:43:46.506 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:47 np0005596062 nova_compute[227313]: 2026-01-26 18:43:47.559 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:47.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:43:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:48.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:43:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:49.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:50.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:51 np0005596062 nova_compute[227313]: 2026-01-26 18:43:51.508 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:51.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:52.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:52 np0005596062 nova_compute[227313]: 2026-01-26 18:43:52.562 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:53.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:54.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:55.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:56.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:56 np0005596062 nova_compute[227313]: 2026-01-26 18:43:56.510 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:57 np0005596062 nova_compute[227313]: 2026-01-26 18:43:57.565 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:43:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:43:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:57.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:57 np0005596062 podman[263689]: 2026-01-26 18:43:57.886983073 +0000 UTC m=+0.083783759 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:43:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:43:58.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:43:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:43:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:43:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:43:59.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:00.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:01 np0005596062 nova_compute[227313]: 2026-01-26 18:44:01.513 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:01.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:44:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:02.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:44:02 np0005596062 nova_compute[227313]: 2026-01-26 18:44:02.567 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:03.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:04.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:05.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:44:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:06.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:44:06 np0005596062 nova_compute[227313]: 2026-01-26 18:44:06.557 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:07 np0005596062 nova_compute[227313]: 2026-01-26 18:44:07.569 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:07.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:08.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:08 np0005596062 podman[263764]: 2026-01-26 18:44:08.882063542 +0000 UTC m=+0.093679544 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:44:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:09.195 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:09.196 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:09.196 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:09.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:10.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:10.593 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:44:10 np0005596062 nova_compute[227313]: 2026-01-26 18:44:10.593 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:10 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:10.594 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:44:11 np0005596062 nova_compute[227313]: 2026-01-26 18:44:11.559 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:11.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:12.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:12 np0005596062 nova_compute[227313]: 2026-01-26 18:44:12.571 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:13.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:14.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:15.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:16.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:16 np0005596062 nova_compute[227313]: 2026-01-26 18:44:16.561 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:16 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:16.596 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:17 np0005596062 nova_compute[227313]: 2026-01-26 18:44:17.572 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:17.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:18 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:18.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:44:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:44:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:19.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:20.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.078 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.078 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.079 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.079 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.079 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:44:21 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3366500787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.610 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.628 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.548s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.772 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.773 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4725MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.773 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.774 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:21.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.840 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.841 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:44:21 np0005596062 nova_compute[227313]: 2026-01-26 18:44:21.863 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:44:22 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/796963437' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:44:22 np0005596062 nova_compute[227313]: 2026-01-26 18:44:22.289 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:44:22 np0005596062 nova_compute[227313]: 2026-01-26 18:44:22.294 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:44:22 np0005596062 nova_compute[227313]: 2026-01-26 18:44:22.307 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:44:22 np0005596062 nova_compute[227313]: 2026-01-26 18:44:22.309 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:44:22 np0005596062 nova_compute[227313]: 2026-01-26 18:44:22.309 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:22.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:22 np0005596062 nova_compute[227313]: 2026-01-26 18:44:22.576 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:23.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:24.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:25.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:25 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:25 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:44:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:44:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:26.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:44:26 np0005596062 nova_compute[227313]: 2026-01-26 18:44:26.612 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:27 np0005596062 nova_compute[227313]: 2026-01-26 18:44:27.583 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:27.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:28.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:28 np0005596062 podman[264076]: 2026-01-26 18:44:28.8836971 +0000 UTC m=+0.080811409 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 13:44:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:29.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:30.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.309 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.398 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.398 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.399 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.571 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.572 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.573 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.573 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.574 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:44:31 np0005596062 nova_compute[227313]: 2026-01-26 18:44:31.661 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:44:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:31.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:44:32 np0005596062 nova_compute[227313]: 2026-01-26 18:44:32.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:32 np0005596062 nova_compute[227313]: 2026-01-26 18:44:32.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:32.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:32 np0005596062 nova_compute[227313]: 2026-01-26 18:44:32.586 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:33.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:34.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:35 np0005596062 nova_compute[227313]: 2026-01-26 18:44:35.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:35.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:36 np0005596062 nova_compute[227313]: 2026-01-26 18:44:36.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:44:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:36.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:36 np0005596062 nova_compute[227313]: 2026-01-26 18:44:36.664 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:37 np0005596062 nova_compute[227313]: 2026-01-26 18:44:37.589 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:37.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:38.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:39.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:39 np0005596062 podman[264100]: 2026-01-26 18:44:39.891434679 +0000 UTC m=+0.103022785 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:44:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:40.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:41 np0005596062 nova_compute[227313]: 2026-01-26 18:44:41.667 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:41.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.050 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.050 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.341 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 13:44:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:42.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.419 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.419 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.425 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.425 227317 INFO nova.compute.claims [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.592 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:42 np0005596062 nova_compute[227313]: 2026-01-26 18:44:42.830 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:44:43 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2120931989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.290 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.295 227317 DEBUG nova.compute.provider_tree [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.431 227317 DEBUG nova.scheduler.client.report [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.473 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.474 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.647 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.648 227317 DEBUG nova.network.neutron [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.763 227317 INFO nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 13:44:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:43.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:43 np0005596062 nova_compute[227313]: 2026-01-26 18:44:43.876 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 13:44:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:44.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.365 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.366 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.366 227317 INFO nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Creating image(s)
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.399 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.434 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.465 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.468 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.558 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.559 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "0e27310cde9db7031eb6052434134c1283ddf216" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.559 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.559 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0e27310cde9db7031eb6052434134c1283ddf216" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.587 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.590 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:44:44 np0005596062 nova_compute[227313]: 2026-01-26 18:44:44.612 227317 DEBUG nova.policy [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab4f5e4c36dd409fa5bb8295edb56a1e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f6d1f7624fe846da936bdf952d988dca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.309 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/0e27310cde9db7031eb6052434134c1283ddf216 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.719s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.394 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] resizing rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.506 227317 DEBUG nova.objects.instance [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lazy-loading 'migration_context' on Instance uuid 0da4d154-1c5d-435f-bc88-07c4b9e6f79b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.558 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.559 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Ensure instance console log exists: /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.559 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.560 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:44:45 np0005596062 nova_compute[227313]: 2026-01-26 18:44:45.560 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:44:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:45.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:46.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:46 np0005596062 nova_compute[227313]: 2026-01-26 18:44:46.393 227317 DEBUG nova.network.neutron [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Successfully created port: 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 13:44:46 np0005596062 nova_compute[227313]: 2026-01-26 18:44:46.668 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:44:47 np0005596062 nova_compute[227313]: 2026-01-26 18:44:47.594 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:44:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:47.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:47 np0005596062 nova_compute[227313]: 2026-01-26 18:44:47.935 227317 DEBUG nova.network.neutron [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Successfully updated port: 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.054 227317 DEBUG nova.compute.manager [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-changed-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.054 227317 DEBUG nova.compute.manager [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Refreshing instance network info cache due to event network-changed-2832c6c0-b897-4481-8a2e-b13ebd13fdf7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.054 227317 DEBUG oslo_concurrency.lockutils [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.055 227317 DEBUG oslo_concurrency.lockutils [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.055 227317 DEBUG nova.network.neutron [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Refreshing network info cache for port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.066 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:44:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:48.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.554 227317 DEBUG nova.network.neutron [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.853 227317 DEBUG nova.network.neutron [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.937 227317 DEBUG oslo_concurrency.lockutils [req-4f7f02f5-182c-4676-bff9-c720ba8c4f36 req-27129fea-d9a0-4deb-96e5-4f200910e0b7 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.938 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquired lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:44:48 np0005596062 nova_compute[227313]: 2026-01-26 18:44:48.938 227317 DEBUG nova.network.neutron [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 26 13:44:49 np0005596062 nova_compute[227313]: 2026-01-26 18:44:49.109 227317 DEBUG nova.network.neutron [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 26 13:44:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:49.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.083 227317 DEBUG nova.network.neutron [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.118 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Releasing lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.119 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Instance network_info: |[{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.123 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Start _get_guest_xml network_info=[{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_format': None, 'image_id': '57de5960-c1c5-4cfa-af34-8f58cf25f585'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.129 227317 WARNING nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.137 227317 DEBUG nova.virt.libvirt.host [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.138 227317 DEBUG nova.virt.libvirt.host [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.143 227317 DEBUG nova.virt.libvirt.host [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.144 227317 DEBUG nova.virt.libvirt.host [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.145 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.146 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T18:05:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c19d349c-ad8f-4453-bd9e-1248725b13ed',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T18:05:23Z,direct_url=<?>,disk_format='qcow2',id=57de5960-c1c5-4cfa-af34-8f58cf25f585,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ce9c2caf475c4ad29ab1e03bc8886f7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T18:05:28Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.147 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.147 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.147 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.148 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.148 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.149 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.149 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.150 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.150 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.150 227317 DEBUG nova.virt.hardware [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.155 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:50.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:50 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:44:50 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4284095622' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.586 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.611 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:44:50 np0005596062 nova_compute[227313]: 2026-01-26 18:44:50.614 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 26 13:44:51 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3564213158' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.051 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.053 227317 DEBUG nova.virt.libvirt.vif [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:44:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1885647578',display_name='tempest-TestSnapshotPattern-server-1885647578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1885647578',id=29,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCAYmVs+UW2XJsRtBIbdZbz28ZVdt7AiOxfdjjSsjnkL6p6XTA2fhA867rw0hqdCm+lPM0yPV4ff9dVLHk7OAzo0CgTYKG/4Lv9EiKZeI+OUhOQtFQJysHTnBrgkAFHfCQ==',key_name='tempest-TestSnapshotPattern-1728523139',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f6d1f7624fe846da936bdf952d988dca',ramdisk_id='',reservation_id='r-xy45ksu3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-612206442',owner_user_name='tempest-TestSnapshotPattern-612206442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:44:43Z,user_data=None,user_id='ab4f5e4c36dd409fa5bb8295edb56a1e',uuid=0da4d154-1c5d-435f-bc88-07c4b9e6f79b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.054 227317 DEBUG nova.network.os_vif_util [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Converting VIF {"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.055 227317 DEBUG nova.network.os_vif_util [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.056 227317 DEBUG nova.objects.instance [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lazy-loading 'pci_devices' on Instance uuid 0da4d154-1c5d-435f-bc88-07c4b9e6f79b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.166 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] End _get_guest_xml xml=<domain type="kvm">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <uuid>0da4d154-1c5d-435f-bc88-07c4b9e6f79b</uuid>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <name>instance-0000001d</name>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <memory>131072</memory>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <vcpu>1</vcpu>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <metadata>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:name>tempest-TestSnapshotPattern-server-1885647578</nova:name>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:creationTime>2026-01-26 18:44:50</nova:creationTime>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:flavor name="m1.nano">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:memory>128</nova:memory>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:disk>1</nova:disk>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:swap>0</nova:swap>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:vcpus>1</nova:vcpus>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </nova:flavor>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:owner>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:user uuid="ab4f5e4c36dd409fa5bb8295edb56a1e">tempest-TestSnapshotPattern-612206442-project-member</nova:user>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:project uuid="f6d1f7624fe846da936bdf952d988dca">tempest-TestSnapshotPattern-612206442</nova:project>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </nova:owner>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:root type="image" uuid="57de5960-c1c5-4cfa-af34-8f58cf25f585"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <nova:ports>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <nova:port uuid="2832c6c0-b897-4481-8a2e-b13ebd13fdf7">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        </nova:port>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </nova:ports>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </nova:instance>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </metadata>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <sysinfo type="smbios">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <system>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <entry name="manufacturer">RDO</entry>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <entry name="product">OpenStack Compute</entry>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <entry name="serial">0da4d154-1c5d-435f-bc88-07c4b9e6f79b</entry>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <entry name="uuid">0da4d154-1c5d-435f-bc88-07c4b9e6f79b</entry>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <entry name="family">Virtual Machine</entry>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </system>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </sysinfo>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <os>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <boot dev="hd"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <smbios mode="sysinfo"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </os>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <features>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <acpi/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <apic/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <vmcoreinfo/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </features>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <clock offset="utc">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <timer name="hpet" present="no"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </clock>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <cpu mode="custom" match="exact">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <model>Nehalem</model>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </cpu>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  <devices>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <disk type="network" device="disk">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <target dev="vda" bus="virtio"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <disk type="network" device="cdrom">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <driver type="raw" cache="none"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <source protocol="rbd" name="vms/0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk.config">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <host name="192.168.122.100" port="6789"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <host name="192.168.122.102" port="6789"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <host name="192.168.122.101" port="6789"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </source>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <auth username="openstack">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:        <secret type="ceph" uuid="d4cd1917-5876-51b6-bc64-65a16199754d"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      </auth>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <target dev="sda" bus="sata"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </disk>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <interface type="ethernet">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <mac address="fa:16:3e:68:80:6e"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <mtu size="1442"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <target dev="tap2832c6c0-b8"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </interface>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <serial type="pty">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <log file="/var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/console.log" append="off"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </serial>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <video>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <model type="virtio"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </video>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <input type="tablet" bus="usb"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <rng model="virtio">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <backend model="random">/dev/urandom</backend>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </rng>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <controller type="usb" index="0"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    <memballoon model="virtio">
Jan 26 13:44:51 np0005596062 nova_compute[227313]:      <stats period="10"/>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:    </memballoon>
Jan 26 13:44:51 np0005596062 nova_compute[227313]:  </devices>
Jan 26 13:44:51 np0005596062 nova_compute[227313]: </domain>
Jan 26 13:44:51 np0005596062 nova_compute[227313]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.168 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Preparing to wait for external event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.168 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.168 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.169 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.169 227317 DEBUG nova.virt.libvirt.vif [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T18:44:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1885647578',display_name='tempest-TestSnapshotPattern-server-1885647578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1885647578',id=29,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCAYmVs+UW2XJsRtBIbdZbz28ZVdt7AiOxfdjjSsjnkL6p6XTA2fhA867rw0hqdCm+lPM0yPV4ff9dVLHk7OAzo0CgTYKG/4Lv9EiKZeI+OUhOQtFQJysHTnBrgkAFHfCQ==',key_name='tempest-TestSnapshotPattern-1728523139',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f6d1f7624fe846da936bdf952d988dca',ramdisk_id='',reservation_id='r-xy45ksu3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-612206442',owner_user_name='tempest-TestSnapshotPattern-612206442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T18:44:43Z,user_data=None,user_id='ab4f5e4c36dd409fa5bb8295edb56a1e',uuid=0da4d154-1c5d-435f-bc88-07c4b9e6f79b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.170 227317 DEBUG nova.network.os_vif_util [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Converting VIF {"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.171 227317 DEBUG nova.network.os_vif_util [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.171 227317 DEBUG os_vif [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.172 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.172 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.173 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.178 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.178 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2832c6c0-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.179 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2832c6c0-b8, col_values=(('external_ids', {'iface-id': '2832c6c0-b897-4481-8a2e-b13ebd13fdf7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:80:6e', 'vm-uuid': '0da4d154-1c5d-435f-bc88-07c4b9e6f79b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.208 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:51 np0005596062 NetworkManager[48993]: <info>  [1769453091.2094] manager: (tap2832c6c0-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.212 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.214 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.215 227317 INFO os_vif [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8')#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.297 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.297 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.297 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] No VIF found with MAC fa:16:3e:68:80:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.298 227317 INFO nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Using config drive#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.323 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.692 227317 INFO nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Creating config drive at /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/disk.config#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.696 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzaqney3e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.826 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzaqney3e" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:44:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:51.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.875 227317 DEBUG nova.storage.rbd_utils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] rbd image 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 13:44:51 np0005596062 nova_compute[227313]: 2026-01-26 18:44:51.881 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/disk.config 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:44:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:52.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:52 np0005596062 nova_compute[227313]: 2026-01-26 18:44:52.596 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.080 227317 DEBUG oslo_concurrency.processutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/disk.config 0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.081 227317 INFO nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Deleting local config drive /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b/disk.config because it was imported into RBD.#033[00m
Jan 26 13:44:53 np0005596062 kernel: tap2832c6c0-b8: entered promiscuous mode
Jan 26 13:44:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:44:53Z|00237|binding|INFO|Claiming lport 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 for this chassis.
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.141 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:44:53Z|00238|binding|INFO|2832c6c0-b897-4481-8a2e-b13ebd13fdf7: Claiming fa:16:3e:68:80:6e 10.100.0.13
Jan 26 13:44:53 np0005596062 NetworkManager[48993]: <info>  [1769453093.1444] manager: (tap2832c6c0-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.167 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:80:6e 10.100.0.13'], port_security=['fa:16:3e:68:80:6e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0da4d154-1c5d-435f-bc88-07c4b9e6f79b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c92bd0c-b67a-4232-823a-830d97d73785', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6d1f7624fe846da936bdf952d988dca', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24e47fcc-5b62-4556-b880-35104e4b6ec2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4ce7d98-bbfb-4f37-af96-1528ef95ee96, chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=2832c6c0-b897-4481-8a2e-b13ebd13fdf7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.170 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 in datapath 3c92bd0c-b67a-4232-823a-830d97d73785 bound to our chassis#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.172 143929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c92bd0c-b67a-4232-823a-830d97d73785#033[00m
Jan 26 13:44:53 np0005596062 systemd-machined[195380]: New machine qemu-22-instance-0000001d.
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.186 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[17ec3f2e-a8d7-4fc4-bfcf-7c139dec31d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.187 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c92bd0c-b1 in ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.189 230329 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c92bd0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.189 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[514b235f-da7f-46c9-89e8-7db8bb5622f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.191 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[c20266a0-ff5b-4a97-b5b7-2fb31a2f77ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.205 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[981565cf-1a2d-4af2-a97a-cc1054caf36a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Jan 26 13:44:53 np0005596062 systemd-udevd[264510]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.235 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[72aa1d81-cd13-4c79-aff6-f1f9e20c7bff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.245 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 NetworkManager[48993]: <info>  [1769453093.2519] device (tap2832c6c0-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 13:44:53 np0005596062 NetworkManager[48993]: <info>  [1769453093.2528] device (tap2832c6c0-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 13:44:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:44:53Z|00239|binding|INFO|Setting lport 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 ovn-installed in OVS
Jan 26 13:44:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:44:53Z|00240|binding|INFO|Setting lport 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 up in Southbound
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.257 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.268 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[da0dc7c4-8d47-4a10-92bb-138adb836915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 NetworkManager[48993]: <info>  [1769453093.2747] manager: (tap3c92bd0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/116)
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.274 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[10fb2d04-ff6f-40b7-9d13-b3e6a3c33e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.315 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[b63fa47f-fa93-4537-a198-dd5564abf00f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.318 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e7304f-f390-4b76-884d-64ee7b164cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 NetworkManager[48993]: <info>  [1769453093.3401] device (tap3c92bd0c-b0): carrier: link connected
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.346 230412 DEBUG oslo.privsep.daemon [-] privsep: reply[56fe22a4-c7ae-48b2-b83c-2ec3087713de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.365 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[e0815c95-66a6-4257-a3e6-ad0f5ef3e3c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c92bd0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:36:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685339, 'reachable_time': 41981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264540, 'error': None, 'target': 'ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.382 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[32241e0e-a624-41fc-8164-9bec6dc7e057]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:3654'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 685339, 'tstamp': 685339}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264541, 'error': None, 'target': 'ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.399 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7f752c-1b51-4bdf-a462-a28bf72fee74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c92bd0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:36:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685339, 'reachable_time': 41981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264542, 'error': None, 'target': 'ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.434 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[512066eb-0bc4-4903-9925-873e8b9d4657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.503 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b36a6be6-6d8b-41d1-94f1-f6f54c887563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.505 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c92bd0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.505 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.505 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c92bd0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.507 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 NetworkManager[48993]: <info>  [1769453093.5090] manager: (tap3c92bd0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 26 13:44:53 np0005596062 kernel: tap3c92bd0c-b0: entered promiscuous mode
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.513 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.516 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c92bd0c-b0, col_values=(('external_ids', {'iface-id': '694ebde7-9ee4-4b59-afb6-8479ba63b2ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.517 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 ovn_controller[133984]: 2026-01-26T18:44:53Z|00241|binding|INFO|Releasing lport 694ebde7-9ee4-4b59-afb6-8479ba63b2ad from this chassis (sb_readonly=0)
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.545 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.547 143929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c92bd0c-b67a-4232-823a-830d97d73785.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c92bd0c-b67a-4232-823a-830d97d73785.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.548 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[b01f0026-e226-4538-ac64-f7e0c1895016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.549 143929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: global
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    log         /dev/log local0 debug
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    log-tag     haproxy-metadata-proxy-3c92bd0c-b67a-4232-823a-830d97d73785
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    user        root
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    group       root
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    maxconn     1024
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    pidfile     /var/lib/neutron/external/pids/3c92bd0c-b67a-4232-823a-830d97d73785.pid.haproxy
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    daemon
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: defaults
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    log global
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    mode http
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    option httplog
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    option dontlognull
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    option http-server-close
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    option forwardfor
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    retries                 3
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    timeout http-request    30s
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    timeout connect         30s
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    timeout client          32s
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    timeout server          32s
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    timeout http-keep-alive 30s
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: listen listener
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    bind 169.254.169.254:80
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]:    http-request add-header X-OVN-Network-ID 3c92bd0c-b67a-4232-823a-830d97d73785
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 13:44:53 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:44:53.551 143929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785', 'env', 'PROCESS_TAG=haproxy-3c92bd0c-b67a-4232-823a-830d97d73785', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c92bd0c-b67a-4232-823a-830d97d73785.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 13:44:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:53.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:53 np0005596062 podman[264592]: 2026-01-26 18:44:53.941723742 +0000 UTC m=+0.051256996 container create 22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.961 227317 DEBUG nova.compute.manager [req-fe742901-069c-43e3-bf1a-cd283dd94203 req-a5dc2ad8-3239-4741-a858-6a6eab97faea 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.963 227317 DEBUG oslo_concurrency.lockutils [req-fe742901-069c-43e3-bf1a-cd283dd94203 req-a5dc2ad8-3239-4741-a858-6a6eab97faea 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.964 227317 DEBUG oslo_concurrency.lockutils [req-fe742901-069c-43e3-bf1a-cd283dd94203 req-a5dc2ad8-3239-4741-a858-6a6eab97faea 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.964 227317 DEBUG oslo_concurrency.lockutils [req-fe742901-069c-43e3-bf1a-cd283dd94203 req-a5dc2ad8-3239-4741-a858-6a6eab97faea 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:53 np0005596062 nova_compute[227313]: 2026-01-26 18:44:53.965 227317 DEBUG nova.compute.manager [req-fe742901-069c-43e3-bf1a-cd283dd94203 req-a5dc2ad8-3239-4741-a858-6a6eab97faea 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Processing event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 13:44:53 np0005596062 systemd[1]: Started libpod-conmon-22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b.scope.
Jan 26 13:44:54 np0005596062 podman[264592]: 2026-01-26 18:44:53.913920406 +0000 UTC m=+0.023453680 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 26 13:44:54 np0005596062 systemd[1]: Started libcrun container.
Jan 26 13:44:54 np0005596062 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/143fea4befdd33d8b6a5657c7ead682ddc9b2ec1b3a82064ae863760dad98027/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 13:44:54 np0005596062 podman[264592]: 2026-01-26 18:44:54.057790136 +0000 UTC m=+0.167323420 container init 22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 26 13:44:54 np0005596062 podman[264592]: 2026-01-26 18:44:54.063772286 +0000 UTC m=+0.173305540 container start 22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 26 13:44:54 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [NOTICE]   (264634) : New worker (264637) forked
Jan 26 13:44:54 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [NOTICE]   (264634) : Loading success.
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.128 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769453094.1281815, 0da4d154-1c5d-435f-bc88-07c4b9e6f79b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.129 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] VM Started (Lifecycle Event)#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.131 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.135 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.138 227317 INFO nova.virt.libvirt.driver [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Instance spawned successfully.#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.138 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.151 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.157 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.160 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.160 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.160 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.161 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.161 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.161 227317 DEBUG nova.virt.libvirt.driver [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.321 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.322 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769453094.131096, 0da4d154-1c5d-435f-bc88-07c4b9e6f79b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.322 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] VM Paused (Lifecycle Event)#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.356 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:44:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:54.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.361 227317 INFO nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Took 10.00 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.362 227317 DEBUG nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.363 227317 DEBUG nova.virt.driver [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] Emitting event <LifecycleEvent: 1769453094.1341627, 0da4d154-1c5d-435f-bc88-07c4b9e6f79b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.363 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] VM Resumed (Lifecycle Event)#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.380 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.384 227317 DEBUG nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.498 227317 INFO nova.compute.manager [None req-0d2ede7f-8536-4f9e-8a11-8048837bf72e - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.553 227317 INFO nova.compute.manager [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Took 12.16 seconds to build instance.#033[00m
Jan 26 13:44:54 np0005596062 nova_compute[227313]: 2026-01-26 18:44:54.654 227317 DEBUG oslo_concurrency.lockutils [None req-1ca08952-1db2-4d04-907a-ed435c6f1098 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:44:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:55.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.056 227317 DEBUG nova.compute.manager [req-3230cee9-3bb6-47a8-b91e-cd31094713a9 req-2134b2cf-b95e-4c22-bd63-b6f0d9cba670 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.057 227317 DEBUG oslo_concurrency.lockutils [req-3230cee9-3bb6-47a8-b91e-cd31094713a9 req-2134b2cf-b95e-4c22-bd63-b6f0d9cba670 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.057 227317 DEBUG oslo_concurrency.lockutils [req-3230cee9-3bb6-47a8-b91e-cd31094713a9 req-2134b2cf-b95e-4c22-bd63-b6f0d9cba670 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.057 227317 DEBUG oslo_concurrency.lockutils [req-3230cee9-3bb6-47a8-b91e-cd31094713a9 req-2134b2cf-b95e-4c22-bd63-b6f0d9cba670 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.057 227317 DEBUG nova.compute.manager [req-3230cee9-3bb6-47a8-b91e-cd31094713a9 req-2134b2cf-b95e-4c22-bd63-b6f0d9cba670 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] No waiting events found dispatching network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.058 227317 WARNING nova.compute.manager [req-3230cee9-3bb6-47a8-b91e-cd31094713a9 req-2134b2cf-b95e-4c22-bd63-b6f0d9cba670 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received unexpected event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 for instance with vm_state active and task_state None.#033[00m
Jan 26 13:44:56 np0005596062 nova_compute[227313]: 2026-01-26 18:44:56.211 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:56.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:57 np0005596062 nova_compute[227313]: 2026-01-26 18:44:57.598 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:44:57 np0005596062 NetworkManager[48993]: <info>  [1769453097.7421] manager: (patch-br-int-to-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Jan 26 13:44:57 np0005596062 NetworkManager[48993]: <info>  [1769453097.7425] manager: (patch-provnet-7e8d8b01-8f69-4c2f-9ca3-c7f2a9ff632c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 26 13:44:57 np0005596062 nova_compute[227313]: 2026-01-26 18:44:57.739 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:57 np0005596062 nova_compute[227313]: 2026-01-26 18:44:57.836 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:57 np0005596062 ovn_controller[133984]: 2026-01-26T18:44:57Z|00242|binding|INFO|Releasing lport 694ebde7-9ee4-4b59-afb6-8479ba63b2ad from this chassis (sb_readonly=0)
Jan 26 13:44:57 np0005596062 nova_compute[227313]: 2026-01-26 18:44:57.843 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:44:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:57.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:58 np0005596062 nova_compute[227313]: 2026-01-26 18:44:58.063 227317 DEBUG nova.compute.manager [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-changed-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:44:58 np0005596062 nova_compute[227313]: 2026-01-26 18:44:58.064 227317 DEBUG nova.compute.manager [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Refreshing instance network info cache due to event network-changed-2832c6c0-b897-4481-8a2e-b13ebd13fdf7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:44:58 np0005596062 nova_compute[227313]: 2026-01-26 18:44:58.064 227317 DEBUG oslo_concurrency.lockutils [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:44:58 np0005596062 nova_compute[227313]: 2026-01-26 18:44:58.064 227317 DEBUG oslo_concurrency.lockutils [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:44:58 np0005596062 nova_compute[227313]: 2026-01-26 18:44:58.064 227317 DEBUG nova.network.neutron [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Refreshing network info cache for port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:44:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:44:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:44:58.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:44:59 np0005596062 nova_compute[227313]: 2026-01-26 18:44:59.233 227317 DEBUG nova.network.neutron [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updated VIF entry in instance network info cache for port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 13:44:59 np0005596062 nova_compute[227313]: 2026-01-26 18:44:59.235 227317 DEBUG nova.network.neutron [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:44:59 np0005596062 nova_compute[227313]: 2026-01-26 18:44:59.254 227317 DEBUG oslo_concurrency.lockutils [req-59966ce9-0a8e-4211-bcd8-f6ef538e334a req-b4d8ea33-977b-4d68-9ec8-3b7fdeb11fe3 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:44:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:44:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:44:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:44:59.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:44:59 np0005596062 podman[264650]: 2026-01-26 18:44:59.881933128 +0000 UTC m=+0.078754144 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 13:45:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:00.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:01 np0005596062 nova_compute[227313]: 2026-01-26 18:45:01.215 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:01.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:02.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:02 np0005596062 nova_compute[227313]: 2026-01-26 18:45:02.600 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:03.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:04.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:05.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:06 np0005596062 nova_compute[227313]: 2026-01-26 18:45:06.219 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:06.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:07 np0005596062 nova_compute[227313]: 2026-01-26 18:45:07.601 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:07.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:08.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:08 np0005596062 ovn_controller[133984]: 2026-01-26T18:45:08Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:80:6e 10.100.0.13
Jan 26 13:45:08 np0005596062 ovn_controller[133984]: 2026-01-26T18:45:08Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:80:6e 10.100.0.13
Jan 26 13:45:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:45:09.197 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:45:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:45:09.198 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:45:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:45:09.199 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:45:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:09.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:10.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:10 np0005596062 podman[264728]: 2026-01-26 18:45:10.874084877 +0000 UTC m=+0.081206190 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 13:45:11 np0005596062 nova_compute[227313]: 2026-01-26 18:45:11.220 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:11.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:12.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:12 np0005596062 nova_compute[227313]: 2026-01-26 18:45:12.603 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:13.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:14.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:15.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:16 np0005596062 nova_compute[227313]: 2026-01-26 18:45:16.223 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:16.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:17 np0005596062 nova_compute[227313]: 2026-01-26 18:45:17.605 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:17.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:18.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:19.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:20.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.095 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.096 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.096 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.227 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:45:21 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3077742346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:45:21 np0005596062 nova_compute[227313]: 2026-01-26 18:45:21.508 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:45:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:21.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.269 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.269 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:45:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:45:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:22.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.413 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.414 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4513MB free_disk=20.9427490234375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.414 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.414 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.593 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 0da4d154-1c5d-435f-bc88-07c4b9e6f79b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.593 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.596 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.608 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:22 np0005596062 nova_compute[227313]: 2026-01-26 18:45:22.631 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:45:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:45:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3946748615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:45:23 np0005596062 nova_compute[227313]: 2026-01-26 18:45:23.032 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:45:23 np0005596062 nova_compute[227313]: 2026-01-26 18:45:23.037 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:45:23 np0005596062 nova_compute[227313]: 2026-01-26 18:45:23.369 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:45:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:23.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:24 np0005596062 nova_compute[227313]: 2026-01-26 18:45:24.211 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:45:24 np0005596062 nova_compute[227313]: 2026-01-26 18:45:24.211 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:45:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:24.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:25.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:26 np0005596062 nova_compute[227313]: 2026-01-26 18:45:26.230 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:26.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:45:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:45:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 13:45:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 13:45:27 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:45:27 np0005596062 nova_compute[227313]: 2026-01-26 18:45:27.611 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:27 np0005596062 ovn_controller[133984]: 2026-01-26T18:45:27Z|00243|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Jan 26 13:45:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:27.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:45:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:45:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:28.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:29.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:30.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:30 np0005596062 podman[265108]: 2026-01-26 18:45:30.850416508 +0000 UTC m=+0.058813359 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 13:45:31 np0005596062 nova_compute[227313]: 2026-01-26 18:45:31.233 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:31 np0005596062 nova_compute[227313]: 2026-01-26 18:45:31.537 227317 DEBUG nova.compute.manager [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:45:31 np0005596062 nova_compute[227313]: 2026-01-26 18:45:31.577 227317 INFO nova.compute.manager [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] instance snapshotting#033[00m
Jan 26 13:45:31 np0005596062 nova_compute[227313]: 2026-01-26 18:45:31.807 227317 INFO nova.virt.libvirt.driver [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Beginning live snapshot process#033[00m
Jan 26 13:45:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:31.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:31 np0005596062 nova_compute[227313]: 2026-01-26 18:45:31.953 227317 DEBUG nova.virt.libvirt.imagebackend [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] No parent info for 57de5960-c1c5-4cfa-af34-8f58cf25f585; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 26 13:45:32 np0005596062 nova_compute[227313]: 2026-01-26 18:45:32.190 227317 DEBUG nova.storage.rbd_utils [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] creating snapshot(63356b265e1c4afbb7c6157b1296757b) on rbd image(0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:45:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:32.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:32 np0005596062 nova_compute[227313]: 2026-01-26 18:45:32.613 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.103 227317 DEBUG nova.storage.rbd_utils [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] cloning vms/0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk@63356b265e1c4afbb7c6157b1296757b to images/78a38f51-2188-4186-ba53-2edab9be0ff2 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.212 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.212 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.212 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.213 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.237 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.238 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.238 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.238 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0da4d154-1c5d-435f-bc88-07c4b9e6f79b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:45:33 np0005596062 nova_compute[227313]: 2026-01-26 18:45:33.324 227317 DEBUG nova.storage.rbd_utils [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] flattening images/78a38f51-2188-4186-ba53-2edab9be0ff2 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 26 13:45:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:45:33 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:45:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:33.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.083 227317 DEBUG nova.storage.rbd_utils [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] removing snapshot(63356b265e1c4afbb7c6157b1296757b) on rbd image(0da4d154-1c5d-435f-bc88-07c4b9e6f79b_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 26 13:45:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:34.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:34 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.746 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.781 227317 DEBUG nova.storage.rbd_utils [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] creating snapshot(snap) on rbd image(78a38f51-2188-4186-ba53-2edab9be0ff2) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.920 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.921 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.921 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.921 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.922 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.922 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:34 np0005596062 nova_compute[227313]: 2026-01-26 18:45:34.922 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:45:35 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 26 13:45:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:35.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:36 np0005596062 nova_compute[227313]: 2026-01-26 18:45:36.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:36 np0005596062 nova_compute[227313]: 2026-01-26 18:45:36.236 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:36.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:37 np0005596062 nova_compute[227313]: 2026-01-26 18:45:37.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:45:37 np0005596062 nova_compute[227313]: 2026-01-26 18:45:37.310 227317 INFO nova.virt.libvirt.driver [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Snapshot image upload complete#033[00m
Jan 26 13:45:37 np0005596062 nova_compute[227313]: 2026-01-26 18:45:37.311 227317 INFO nova.compute.manager [None req-f6d34dcd-62e6-4add-b6e6-3baa723b4df3 ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Took 5.73 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 26 13:45:37 np0005596062 nova_compute[227313]: 2026-01-26 18:45:37.615 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:37.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:38.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:39.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:45:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2018825172' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:45:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:45:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2018825172' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:45:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:40.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:41 np0005596062 nova_compute[227313]: 2026-01-26 18:45:41.240 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 26 13:45:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:41.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:41 np0005596062 podman[265324]: 2026-01-26 18:45:41.938868432 +0000 UTC m=+0.140953633 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:45:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:42.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:42 np0005596062 nova_compute[227313]: 2026-01-26 18:45:42.640 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:43.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:43 np0005596062 nova_compute[227313]: 2026-01-26 18:45:43.986 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:45:43.988 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:45:43 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:45:43.989 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:45:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:44.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:45.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:46 np0005596062 nova_compute[227313]: 2026-01-26 18:45:46.275 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:46.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:47 np0005596062 nova_compute[227313]: 2026-01-26 18:45:47.642 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:48.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:49.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:49 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:45:49.992 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:45:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:50.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:51 np0005596062 nova_compute[227313]: 2026-01-26 18:45:51.279 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:45:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:51.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:45:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:52.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:52 np0005596062 nova_compute[227313]: 2026-01-26 18:45:52.644 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:53.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:54.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:55.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:56 np0005596062 nova_compute[227313]: 2026-01-26 18:45:56.282 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:45:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:56.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:45:57 np0005596062 nova_compute[227313]: 2026-01-26 18:45:57.646 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:45:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:45:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:57.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:45:58.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:45:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:45:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:45:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:45:59.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:00.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:01 np0005596062 nova_compute[227313]: 2026-01-26 18:46:01.316 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:02.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:02 np0005596062 podman[265411]: 2026-01-26 18:46:02.146877567 +0000 UTC m=+0.070848762 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 13:46:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:46:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:02.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:46:02 np0005596062 nova_compute[227313]: 2026-01-26 18:46:02.648 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:04.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:04.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:06.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:06 np0005596062 nova_compute[227313]: 2026-01-26 18:46:06.319 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:06.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:07 np0005596062 nova_compute[227313]: 2026-01-26 18:46:07.649 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:08.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:08.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:09.198 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:46:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:09.199 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:46:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:09.200 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:46:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:10.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:10.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:11 np0005596062 nova_compute[227313]: 2026-01-26 18:46:11.322 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:12.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:12.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:12 np0005596062 nova_compute[227313]: 2026-01-26 18:46:12.651 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:12 np0005596062 podman[265488]: 2026-01-26 18:46:12.883363998 +0000 UTC m=+0.095523464 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 13:46:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:14.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:14.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:16.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:16 np0005596062 nova_compute[227313]: 2026-01-26 18:46:16.325 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:16.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:17 np0005596062 nova_compute[227313]: 2026-01-26 18:46:17.655 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:18.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:18.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:20.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:20.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.077 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.078 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.078 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.078 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.079 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.391 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:46:21 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1364705178' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.548 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.611 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.611 227317 DEBUG nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.753 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.754 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4490MB free_disk=20.936386108398438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.755 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.755 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.857 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Instance 0da4d154-1c5d-435f-bc88-07c4b9e6f79b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.858 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.858 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:46:21 np0005596062 nova_compute[227313]: 2026-01-26 18:46:21.887 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:46:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:22.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:46:22 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2231202616' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:46:22 np0005596062 nova_compute[227313]: 2026-01-26 18:46:22.301 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:46:22 np0005596062 nova_compute[227313]: 2026-01-26 18:46:22.307 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:46:22 np0005596062 nova_compute[227313]: 2026-01-26 18:46:22.336 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:46:22 np0005596062 nova_compute[227313]: 2026-01-26 18:46:22.340 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 13:46:22 np0005596062 nova_compute[227313]: 2026-01-26 18:46:22.341 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:46:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:22.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:22 np0005596062 nova_compute[227313]: 2026-01-26 18:46:22.657 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:24.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:24.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:25 np0005596062 ovn_controller[133984]: 2026-01-26T18:46:25Z|00244|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 26 13:46:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:26.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:26 np0005596062 nova_compute[227313]: 2026-01-26 18:46:26.413 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:26.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:27 np0005596062 nova_compute[227313]: 2026-01-26 18:46:27.659 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:28.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:28.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:30.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:30.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 26 13:46:31 np0005596062 nova_compute[227313]: 2026-01-26 18:46:31.416 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:32.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 26 13:46:32 np0005596062 nova_compute[227313]: 2026-01-26 18:46:32.342 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:32 np0005596062 nova_compute[227313]: 2026-01-26 18:46:32.342 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 13:46:32 np0005596062 nova_compute[227313]: 2026-01-26 18:46:32.342 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 13:46:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:32.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:32 np0005596062 nova_compute[227313]: 2026-01-26 18:46:32.661 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:32 np0005596062 podman[265619]: 2026-01-26 18:46:32.861233979 +0000 UTC m=+0.075607159 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 13:46:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 26 13:46:33 np0005596062 nova_compute[227313]: 2026-01-26 18:46:33.942 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 13:46:33 np0005596062 nova_compute[227313]: 2026-01-26 18:46:33.943 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquired lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 13:46:33 np0005596062 nova_compute[227313]: 2026-01-26 18:46:33.943 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 26 13:46:33 np0005596062 nova_compute[227313]: 2026-01-26 18:46:33.943 227317 DEBUG nova.objects.instance [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0da4d154-1c5d-435f-bc88-07c4b9e6f79b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 13:46:33 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 26 13:46:33 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:33.980478) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:46:33 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 26 13:46:33 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453193980534, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 2251, "num_deletes": 252, "total_data_size": 5480011, "memory_usage": 5555016, "flush_reason": "Manual Compaction"}
Jan 26 13:46:33 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453194008100, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 3581817, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49884, "largest_seqno": 52130, "table_properties": {"data_size": 3572633, "index_size": 5806, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 18815, "raw_average_key_size": 20, "raw_value_size": 3554278, "raw_average_value_size": 3859, "num_data_blocks": 253, "num_entries": 921, "num_filter_entries": 921, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769452996, "oldest_key_time": 1769452996, "file_creation_time": 1769453193, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 27744 microseconds, and 11821 cpu microseconds.
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.008186) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 3581817 bytes OK
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.008246) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.010790) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.010824) EVENT_LOG_v1 {"time_micros": 1769453194010815, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.010852) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 5470170, prev total WAL file size 5470170, number of live WAL files 2.
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.013355) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(3497KB)], [99(10MB)]
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453194013483, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 14230750, "oldest_snapshot_seqno": -1}
Jan 26 13:46:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:34.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7298 keys, 12139181 bytes, temperature: kUnknown
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453194105561, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 12139181, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12090208, "index_size": 29597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18309, "raw_key_size": 188661, "raw_average_key_size": 25, "raw_value_size": 11959131, "raw_average_value_size": 1638, "num_data_blocks": 1179, "num_entries": 7298, "num_filter_entries": 7298, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769453194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.105870) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 12139181 bytes
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.115058) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.4 rd, 131.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 10.2 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 7823, records dropped: 525 output_compression: NoCompression
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.115095) EVENT_LOG_v1 {"time_micros": 1769453194115082, "job": 62, "event": "compaction_finished", "compaction_time_micros": 92173, "compaction_time_cpu_micros": 48565, "output_level": 6, "num_output_files": 1, "total_output_size": 12139181, "num_input_records": 7823, "num_output_records": 7298, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453194116018, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453194117864, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.013170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.117967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.117973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.117974) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.117976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:46:34.117977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:46:34 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:46:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:34.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:36.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.327 227317 DEBUG nova.network.neutron [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.340 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Releasing lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.341 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.341 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.342 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.342 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.342 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.342 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.343 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 13:46:36 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 26 13:46:36 np0005596062 nova_compute[227313]: 2026-01-26 18:46:36.418 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:36.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:37 np0005596062 nova_compute[227313]: 2026-01-26 18:46:37.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:37 np0005596062 nova_compute[227313]: 2026-01-26 18:46:37.052 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:37 np0005596062 nova_compute[227313]: 2026-01-26 18:46:37.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 13:46:37 np0005596062 nova_compute[227313]: 2026-01-26 18:46:37.666 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:38.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:46:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:38.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:46:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:46:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:40.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:46:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:46:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2854418606' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:46:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:46:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2854418606' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:46:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:46:40 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:46:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:40.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:40 np0005596062 nova_compute[227313]: 2026-01-26 18:46:40.739 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:40.739 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:46:40 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:40.740 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:46:41 np0005596062 nova_compute[227313]: 2026-01-26 18:46:41.420 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:41 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:41.741 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:46:41 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 26 13:46:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:42.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:42.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:42 np0005596062 nova_compute[227313]: 2026-01-26 18:46:42.667 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 26 13:46:43 np0005596062 podman[265826]: 2026-01-26 18:46:43.863065408 +0000 UTC m=+0.073992996 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:46:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:46:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:44.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:46:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:44.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.169 227317 DEBUG nova.compute.manager [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-changed-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.169 227317 DEBUG nova.compute.manager [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Refreshing instance network info cache due to event network-changed-2832c6c0-b897-4481-8a2e-b13ebd13fdf7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.170 227317 DEBUG oslo_concurrency.lockutils [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.170 227317 DEBUG oslo_concurrency.lockutils [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquired lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.170 227317 DEBUG nova.network.neutron [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Refreshing network info cache for port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.258 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.259 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.259 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.260 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.260 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.262 227317 INFO nova.compute.manager [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Terminating instance#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.265 227317 DEBUG nova.compute.manager [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 13:46:45 np0005596062 kernel: tap2832c6c0-b8 (unregistering): left promiscuous mode
Jan 26 13:46:45 np0005596062 NetworkManager[48993]: <info>  [1769453205.3183] device (tap2832c6c0-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 13:46:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:46:45Z|00245|binding|INFO|Releasing lport 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 from this chassis (sb_readonly=0)
Jan 26 13:46:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:46:45Z|00246|binding|INFO|Setting lport 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 down in Southbound
Jan 26 13:46:45 np0005596062 ovn_controller[133984]: 2026-01-26T18:46:45Z|00247|binding|INFO|Removing iface tap2832c6c0-b8 ovn-installed in OVS
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.329 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.337 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:80:6e 10.100.0.13'], port_security=['fa:16:3e:68:80:6e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0da4d154-1c5d-435f-bc88-07c4b9e6f79b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c92bd0c-b67a-4232-823a-830d97d73785', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6d1f7624fe846da936bdf952d988dca', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24e47fcc-5b62-4556-b880-35104e4b6ec2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4ce7d98-bbfb-4f37-af96-1528ef95ee96, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>], logical_port=2832c6c0-b897-4481-8a2e-b13ebd13fdf7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f748f9b9910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.339 143929 INFO neutron.agent.ovn.metadata.agent [-] Port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7 in datapath 3c92bd0c-b67a-4232-823a-830d97d73785 unbound from our chassis#033[00m
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.341 143929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c92bd0c-b67a-4232-823a-830d97d73785, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.344 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ade1be-c07e-493f-8a30-afa21b2245eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.344 143929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785 namespace which is not needed anymore#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.348 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:45 np0005596062 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 26 13:46:45 np0005596062 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 17.901s CPU time.
Jan 26 13:46:45 np0005596062 systemd-machined[195380]: Machine qemu-22-instance-0000001d terminated.
Jan 26 13:46:45 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [NOTICE]   (264634) : haproxy version is 2.8.14-c23fe91
Jan 26 13:46:45 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [NOTICE]   (264634) : path to executable is /usr/sbin/haproxy
Jan 26 13:46:45 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [WARNING]  (264634) : Exiting Master process...
Jan 26 13:46:45 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [WARNING]  (264634) : Exiting Master process...
Jan 26 13:46:45 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [ALERT]    (264634) : Current worker (264637) exited with code 143 (Terminated)
Jan 26 13:46:45 np0005596062 neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785[264625]: [WARNING]  (264634) : All workers exited. Exiting... (0)
Jan 26 13:46:45 np0005596062 systemd[1]: libpod-22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b.scope: Deactivated successfully.
Jan 26 13:46:45 np0005596062 podman[265927]: 2026-01-26 18:46:45.48493837 +0000 UTC m=+0.045088660 container died 22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.539 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.549 227317 INFO nova.virt.libvirt.driver [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Instance destroyed successfully.#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.550 227317 DEBUG nova.objects.instance [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lazy-loading 'resources' on Instance uuid 0da4d154-1c5d-435f-bc88-07c4b9e6f79b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 13:46:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b-userdata-shm.mount: Deactivated successfully.
Jan 26 13:46:45 np0005596062 systemd[1]: var-lib-containers-storage-overlay-143fea4befdd33d8b6a5657c7ead682ddc9b2ec1b3a82064ae863760dad98027-merged.mount: Deactivated successfully.
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.564 227317 DEBUG nova.virt.libvirt.vif [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T18:44:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1885647578',display_name='tempest-TestSnapshotPattern-server-1885647578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1885647578',id=29,image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCAYmVs+UW2XJsRtBIbdZbz28ZVdt7AiOxfdjjSsjnkL6p6XTA2fhA867rw0hqdCm+lPM0yPV4ff9dVLHk7OAzo0CgTYKG/4Lv9EiKZeI+OUhOQtFQJysHTnBrgkAFHfCQ==',key_name='tempest-TestSnapshotPattern-1728523139',keypairs=<?>,launch_index=0,launched_at=2026-01-26T18:44:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f6d1f7624fe846da936bdf952d988dca',ramdisk_id='',reservation_id='r-xy45ksu3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='57de5960-c1c5-4cfa-af34-8f58cf25f585',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-612206442',owner_user_name='tempest-TestSnapshotPattern-612206442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T18:45:37Z,user_data=None,user_id='ab4f5e4c36dd409fa5bb8295edb56a1e',uuid=0da4d154-1c5d-435f-bc88-07c4b9e6f79b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.565 227317 DEBUG nova.network.os_vif_util [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Converting VIF {"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.566 227317 DEBUG nova.network.os_vif_util [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.566 227317 DEBUG os_vif [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.567 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.568 227317 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2832c6c0-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.570 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.571 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:46:45 np0005596062 podman[265927]: 2026-01-26 18:46:45.57400958 +0000 UTC m=+0.134159860 container cleanup 22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.574 227317 INFO os_vif [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:80:6e,bridge_name='br-int',has_traffic_filtering=True,id=2832c6c0-b897-4481-8a2e-b13ebd13fdf7,network=Network(3c92bd0c-b67a-4232-823a-830d97d73785),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2832c6c0-b8')#033[00m
Jan 26 13:46:45 np0005596062 systemd[1]: libpod-conmon-22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b.scope: Deactivated successfully.
Jan 26 13:46:45 np0005596062 podman[265976]: 2026-01-26 18:46:45.641345547 +0000 UTC m=+0.043057127 container remove 22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.647 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[4b39508e-113d-4a4f-aab1-d2632e2c7677]: (4, ('Mon Jan 26 06:46:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785 (22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b)\n22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b\nMon Jan 26 06:46:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785 (22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b)\n22177ae060e0ac7d5c6494197a451f149b9883aaa9df1962b1902aa13f6c5b7b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.648 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[fd482bbe-425c-4956-a257-c2ea3f90eeda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.649 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c92bd0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.651 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:45 np0005596062 kernel: tap3c92bd0c-b0: left promiscuous mode
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.663 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.666 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[2bce810e-7f3e-424a-a97f-d4e1eca2917b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.685 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5a006d-ed0c-4ffb-9b01-445071c8d54d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.686 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[306f86d0-5caa-46b5-8d6f-cca914e5805f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.700 230329 DEBUG oslo.privsep.daemon [-] privsep: reply[036143d0-70c3-4b98-92f1-893a871379be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685331, 'reachable_time': 20004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266002, 'error': None, 'target': 'ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 systemd[1]: run-netns-ovnmeta\x2d3c92bd0c\x2db67a\x2d4232\x2d823a\x2d830d97d73785.mount: Deactivated successfully.
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.705 144040 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c92bd0c-b67a-4232-823a-830d97d73785 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 26 13:46:45 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:46:45.705 144040 DEBUG oslo.privsep.daemon [-] privsep: reply[b76a6e70-99ef-481e-9e82-87c1528cceb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.755 227317 DEBUG nova.compute.manager [req-e67e562d-a486-4f18-887a-44a9120e26da req-8c649216-917f-4a31-aae3-aca427e8c048 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-vif-unplugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.756 227317 DEBUG oslo_concurrency.lockutils [req-e67e562d-a486-4f18-887a-44a9120e26da req-8c649216-917f-4a31-aae3-aca427e8c048 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.756 227317 DEBUG oslo_concurrency.lockutils [req-e67e562d-a486-4f18-887a-44a9120e26da req-8c649216-917f-4a31-aae3-aca427e8c048 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.756 227317 DEBUG oslo_concurrency.lockutils [req-e67e562d-a486-4f18-887a-44a9120e26da req-8c649216-917f-4a31-aae3-aca427e8c048 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.756 227317 DEBUG nova.compute.manager [req-e67e562d-a486-4f18-887a-44a9120e26da req-8c649216-917f-4a31-aae3-aca427e8c048 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] No waiting events found dispatching network-vif-unplugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 13:46:45 np0005596062 nova_compute[227313]: 2026-01-26 18:46:45.756 227317 DEBUG nova.compute.manager [req-e67e562d-a486-4f18-887a-44a9120e26da req-8c649216-917f-4a31-aae3-aca427e8c048 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-vif-unplugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.042 227317 INFO nova.virt.libvirt.driver [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Deleting instance files /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b_del
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.042 227317 INFO nova.virt.libvirt.driver [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Deletion of /var/lib/nova/instances/0da4d154-1c5d-435f-bc88-07c4b9e6f79b_del complete
Jan 26 13:46:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:46.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.136 227317 INFO nova.compute.manager [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Took 0.87 seconds to destroy the instance on the hypervisor.
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.136 227317 DEBUG oslo.service.loopingcall [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.136 227317 DEBUG nova.compute.manager [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.136 227317 DEBUG nova.network.neutron [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 26 13:46:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:46.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.608 227317 DEBUG nova.network.neutron [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updated VIF entry in instance network info cache for port 2832c6c0-b897-4481-8a2e-b13ebd13fdf7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.609 227317 DEBUG nova.network.neutron [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [{"id": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "address": "fa:16:3e:68:80:6e", "network": {"id": "3c92bd0c-b67a-4232-823a-830d97d73785", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-964278989-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f6d1f7624fe846da936bdf952d988dca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2832c6c0-b8", "ovs_interfaceid": "2832c6c0-b897-4481-8a2e-b13ebd13fdf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.633 227317 DEBUG oslo_concurrency.lockutils [req-9e4fbf73-5e7c-4439-a402-ced5b927c940 req-7867b6b9-762f-4c41-8676-d49b6845a38b 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Releasing lock "refresh_cache-0da4d154-1c5d-435f-bc88-07c4b9e6f79b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 13:46:46 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.898 227317 DEBUG nova.network.neutron [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 13:46:46 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.924 227317 INFO nova.compute.manager [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Took 0.79 seconds to deallocate network for instance.
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:46.999 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.000 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.099 227317 DEBUG oslo_concurrency.processutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 13:46:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:46:47 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2210463450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.527 227317 DEBUG oslo_concurrency.processutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.532 227317 DEBUG nova.compute.provider_tree [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.561 227317 DEBUG nova.scheduler.client.report [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.587 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.614 227317 INFO nova.scheduler.client.report [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Deleted allocations for instance 0da4d154-1c5d-435f-bc88-07c4b9e6f79b
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.668 227317 DEBUG oslo_concurrency.lockutils [None req-7a14d3f1-a6d3-4189-9bf7-b49eac6103ef ab4f5e4c36dd409fa5bb8295edb56a1e f6d1f7624fe846da936bdf952d988dca - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.670 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.875 227317 DEBUG nova.compute.manager [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.876 227317 DEBUG oslo_concurrency.lockutils [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Acquiring lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.876 227317 DEBUG oslo_concurrency.lockutils [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.876 227317 DEBUG oslo_concurrency.lockutils [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] Lock "0da4d154-1c5d-435f-bc88-07c4b9e6f79b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.876 227317 DEBUG nova.compute.manager [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] No waiting events found dispatching network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.877 227317 WARNING nova.compute.manager [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received unexpected event network-vif-plugged-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 for instance with vm_state deleted and task_state None.
Jan 26 13:46:47 np0005596062 nova_compute[227313]: 2026-01-26 18:46:47.877 227317 DEBUG nova.compute.manager [req-69327af1-7af1-4b0e-aa3b-160571f002f8 req-196ad4e6-6d42-4dd4-99fd-67c6d3b53ec5 7c80cb855ca14686bf519248f6e32904 f838374af7b94395a3a022cf51817435 - - default default] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Received event network-vif-deleted-2832c6c0-b897-4481-8a2e-b13ebd13fdf7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 13:46:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:48.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:48.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:50.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:50 np0005596062 nova_compute[227313]: 2026-01-26 18:46:50.571 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:51 np0005596062 nova_compute[227313]: 2026-01-26 18:46:51.136 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:51 np0005596062 nova_compute[227313]: 2026-01-26 18:46:51.290 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:51.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 26 13:46:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:52.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:52 np0005596062 nova_compute[227313]: 2026-01-26 18:46:52.671 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:53.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:54.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:55.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:55 np0005596062 nova_compute[227313]: 2026-01-26 18:46:55.575 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:46:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:56.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:46:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:57.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:57 np0005596062 nova_compute[227313]: 2026-01-26 18:46:57.673 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 13:46:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:46:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:46:58.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:46:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:46:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:46:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:46:59.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:00.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:00 np0005596062 nova_compute[227313]: 2026-01-26 18:47:00.546 227317 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769453205.545125, 0da4d154-1c5d-435f-bc88-07c4b9e6f79b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 13:47:00 np0005596062 nova_compute[227313]: 2026-01-26 18:47:00.547 227317 INFO nova.compute.manager [-] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] VM Stopped (Lifecycle Event)#033[00m
Jan 26 13:47:00 np0005596062 nova_compute[227313]: 2026-01-26 18:47:00.578 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:01 np0005596062 nova_compute[227313]: 2026-01-26 18:47:01.211 227317 DEBUG nova.compute.manager [None req-025152f2-194a-49ba-95a7-de76e7c4dba6 - - - - - -] [instance: 0da4d154-1c5d-435f-bc88-07c4b9e6f79b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 13:47:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:01.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:02.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:02 np0005596062 nova_compute[227313]: 2026-01-26 18:47:02.675 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:03.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:03 np0005596062 podman[266038]: 2026-01-26 18:47:03.836235484 +0000 UTC m=+0.051665367 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 13:47:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:04.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:05.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:05 np0005596062 nova_compute[227313]: 2026-01-26 18:47:05.614 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:06.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:07.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:07 np0005596062 nova_compute[227313]: 2026-01-26 18:47:07.678 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:47:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:08.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:47:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:47:09.199 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:47:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:47:09.200 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:47:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:47:09.200 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:47:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:09.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:10.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:10 np0005596062 nova_compute[227313]: 2026-01-26 18:47:10.617 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:11.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:12.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:12 np0005596062 nova_compute[227313]: 2026-01-26 18:47:12.680 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:13.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:14.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:14 np0005596062 podman[266113]: 2026-01-26 18:47:14.871401927 +0000 UTC m=+0.079841063 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 26 13:47:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:15.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:15 np0005596062 nova_compute[227313]: 2026-01-26 18:47:15.664 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:16.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:17.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:17 np0005596062 nova_compute[227313]: 2026-01-26 18:47:17.682 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:18.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:19.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:20 np0005596062 nova_compute[227313]: 2026-01-26 18:47:20.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:20 np0005596062 nova_compute[227313]: 2026-01-26 18:47:20.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:47:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:47:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:20.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:47:20 np0005596062 nova_compute[227313]: 2026-01-26 18:47:20.667 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:21 np0005596062 nova_compute[227313]: 2026-01-26 18:47:21.214 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:21 np0005596062 nova_compute[227313]: 2026-01-26 18:47:21.215 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:47:21 np0005596062 nova_compute[227313]: 2026-01-26 18:47:21.323 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:47:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:21.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:22.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.159 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.207 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.207 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.208 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.208 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.208 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:47:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:47:22 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/993756799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.684 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.687 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:47:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.843 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.844 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4714MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.844 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.845 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.925 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.926 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:47:22 np0005596062 nova_compute[227313]: 2026-01-26 18:47:22.948 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:47:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:47:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2866673892' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:47:23 np0005596062 nova_compute[227313]: 2026-01-26 18:47:23.404 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:47:23 np0005596062 nova_compute[227313]: 2026-01-26 18:47:23.409 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:47:23 np0005596062 nova_compute[227313]: 2026-01-26 18:47:23.430 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:47:23 np0005596062 nova_compute[227313]: 2026-01-26 18:47:23.448 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:47:23 np0005596062 nova_compute[227313]: 2026-01-26 18:47:23.448 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:47:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:23.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:23 np0005596062 nova_compute[227313]: 2026-01-26 18:47:23.658 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:23 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:47:23.659 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:47:23 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:47:23.659 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:47:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:24.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:25.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:25 np0005596062 nova_compute[227313]: 2026-01-26 18:47:25.670 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:26.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:27.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:27 np0005596062 nova_compute[227313]: 2026-01-26 18:47:27.687 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:47:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:28.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:47:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:29.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:30.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:30 np0005596062 nova_compute[227313]: 2026-01-26 18:47:30.674 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:31.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:32.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:32 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:47:32.661 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:47:32 np0005596062 nova_compute[227313]: 2026-01-26 18:47:32.690 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.339 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.339 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.339 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.385 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.385 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.386 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.386 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.386 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:33 np0005596062 nova_compute[227313]: 2026-01-26 18:47:33.386 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:47:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:33.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:47:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:34.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:47:34 np0005596062 podman[266245]: 2026-01-26 18:47:34.835542829 +0000 UTC m=+0.050687621 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:47:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:35.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:35 np0005596062 nova_compute[227313]: 2026-01-26 18:47:35.675 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:36 np0005596062 nova_compute[227313]: 2026-01-26 18:47:36.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:36 np0005596062 nova_compute[227313]: 2026-01-26 18:47:36.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:36.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:37.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:37 np0005596062 nova_compute[227313]: 2026-01-26 18:47:37.692 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:38 np0005596062 nova_compute[227313]: 2026-01-26 18:47:38.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:38.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:39.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:40.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:47:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/710477288' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:47:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:47:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/710477288' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:47:40 np0005596062 nova_compute[227313]: 2026-01-26 18:47:40.729 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:41.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:47:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:47:42 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:47:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:42.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:42 np0005596062 nova_compute[227313]: 2026-01-26 18:47:42.694 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:43.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:44.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:45.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:45 np0005596062 nova_compute[227313]: 2026-01-26 18:47:45.731 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:45 np0005596062 podman[266401]: 2026-01-26 18:47:45.884618525 +0000 UTC m=+0.105968694 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 26 13:47:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:46.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:47.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:47 np0005596062 nova_compute[227313]: 2026-01-26 18:47:47.697 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:48.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:47:48 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:47:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:49.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:50.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:50 np0005596062 nova_compute[227313]: 2026-01-26 18:47:50.772 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:51.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:52.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:52 np0005596062 nova_compute[227313]: 2026-01-26 18:47:52.698 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:53.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:47:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:54.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:47:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:55.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:55 np0005596062 nova_compute[227313]: 2026-01-26 18:47:55.776 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:56.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:57 np0005596062 nova_compute[227313]: 2026-01-26 18:47:57.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:47:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:57.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:57 np0005596062 nova_compute[227313]: 2026-01-26 18:47:57.700 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:47:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:47:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:47:58.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:47:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:47:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:47:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:47:59.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:00 np0005596062 nova_compute[227313]: 2026-01-26 18:48:00.778 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:00 np0005596062 ovn_controller[133984]: 2026-01-26T18:48:00Z|00248|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 26 13:48:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:01.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:02.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:02 np0005596062 nova_compute[227313]: 2026-01-26 18:48:02.701 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:02 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:03.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:04.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:05.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:05 np0005596062 nova_compute[227313]: 2026-01-26 18:48:05.829 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:05 np0005596062 podman[266538]: 2026-01-26 18:48:05.847547766 +0000 UTC m=+0.059360573 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:48:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:06.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:07.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:07 np0005596062 nova_compute[227313]: 2026-01-26 18:48:07.703 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:08.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:48:09.200 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:48:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:48:09.201 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:48:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:48:09.201 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:48:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:09.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:10.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:10 np0005596062 nova_compute[227313]: 2026-01-26 18:48:10.857 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:11.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:12.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:12 np0005596062 nova_compute[227313]: 2026-01-26 18:48:12.705 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:12 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:13.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:14.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:48:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:15.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:48:15 np0005596062 nova_compute[227313]: 2026-01-26 18:48:15.859 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:16.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:16 np0005596062 podman[266613]: 2026-01-26 18:48:16.881780775 +0000 UTC m=+0.092702017 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:48:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:17.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:17 np0005596062 nova_compute[227313]: 2026-01-26 18:48:17.758 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:18.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:19.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:20.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:20 np0005596062 nova_compute[227313]: 2026-01-26 18:48:20.861 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:21.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:22.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:22 np0005596062 nova_compute[227313]: 2026-01-26 18:48:22.761 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.103 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.147 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.148 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.148 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.148 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.149 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:48:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:23.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:48:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2870025123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.743 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.594s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.899 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.900 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4708MB free_disk=20.942890167236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.901 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:48:23 np0005596062 nova_compute[227313]: 2026-01-26 18:48:23.901 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.083 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.083 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.187 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:48:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:24.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.304 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.304 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.324 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.353 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.377 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:48:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:48:24 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/780747197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.807 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.813 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.837 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.839 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:48:24 np0005596062 nova_compute[227313]: 2026-01-26 18:48:24.839 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:48:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:25.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:25 np0005596062 nova_compute[227313]: 2026-01-26 18:48:25.865 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:26.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:26 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 26 13:48:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:26.993973) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:48:26 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 26 13:48:26 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453306994037, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1396, "num_deletes": 253, "total_data_size": 3049845, "memory_usage": 3094128, "flush_reason": "Manual Compaction"}
Jan 26 13:48:26 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453307015441, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 1235690, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52135, "largest_seqno": 53526, "table_properties": {"data_size": 1230963, "index_size": 2123, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12617, "raw_average_key_size": 21, "raw_value_size": 1220588, "raw_average_value_size": 2051, "num_data_blocks": 95, "num_entries": 595, "num_filter_entries": 595, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769453194, "oldest_key_time": 1769453194, "file_creation_time": 1769453306, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 21815 microseconds, and 4218 cpu microseconds.
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.015801) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 1235690 bytes OK
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.015903) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.026334) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.026367) EVENT_LOG_v1 {"time_micros": 1769453307026358, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.026389) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 3043348, prev total WAL file size 3043348, number of live WAL files 2.
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.027798) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373535' seq:72057594037927935, type:22 .. '6D6772737461740032303036' seq:0, type:0; will stop at (end)
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(1206KB)], [102(11MB)]
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453307027854, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13374871, "oldest_snapshot_seqno": -1}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 7420 keys, 10286619 bytes, temperature: kUnknown
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453307097800, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 10286619, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10240030, "index_size": 26957, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18565, "raw_key_size": 191460, "raw_average_key_size": 25, "raw_value_size": 10109954, "raw_average_value_size": 1362, "num_data_blocks": 1071, "num_entries": 7420, "num_filter_entries": 7420, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769453307, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.098062) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 10286619 bytes
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.099675) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 191.0 rd, 146.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 11.6 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(19.1) write-amplify(8.3) OK, records in: 7893, records dropped: 473 output_compression: NoCompression
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.099725) EVENT_LOG_v1 {"time_micros": 1769453307099713, "job": 64, "event": "compaction_finished", "compaction_time_micros": 70025, "compaction_time_cpu_micros": 23082, "output_level": 6, "num_output_files": 1, "total_output_size": 10286619, "num_input_records": 7893, "num_output_records": 7420, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453307100107, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453307103086, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.027719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.103147) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.103153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.103156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.103158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:48:27.103161) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:48:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:27.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:27 np0005596062 nova_compute[227313]: 2026-01-26 18:48:27.761 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:28.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:29.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:30.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:30 np0005596062 nova_compute[227313]: 2026-01-26 18:48:30.867 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:31.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:48:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:32.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:48:32 np0005596062 nova_compute[227313]: 2026-01-26 18:48:32.763 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:32 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:33.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.787 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.787 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.787 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.814 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.814 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.814 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.815 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:33 np0005596062 nova_compute[227313]: 2026-01-26 18:48:33.815 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:48:34 np0005596062 nova_compute[227313]: 2026-01-26 18:48:34.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:34.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:35.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:35 np0005596062 nova_compute[227313]: 2026-01-26 18:48:35.869 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:36 np0005596062 nova_compute[227313]: 2026-01-26 18:48:36.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:36.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:36 np0005596062 podman[266742]: 2026-01-26 18:48:36.838114222 +0000 UTC m=+0.044603668 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 13:48:37 np0005596062 nova_compute[227313]: 2026-01-26 18:48:37.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:37 np0005596062 nova_compute[227313]: 2026-01-26 18:48:37.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:37.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:37 np0005596062 nova_compute[227313]: 2026-01-26 18:48:37.765 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:37 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:38.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:39 np0005596062 nova_compute[227313]: 2026-01-26 18:48:39.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:48:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:39.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:40.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:40 np0005596062 nova_compute[227313]: 2026-01-26 18:48:40.920 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:41.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:42.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:42 np0005596062 nova_compute[227313]: 2026-01-26 18:48:42.767 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:42 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:43.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:44.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:45.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:45 np0005596062 nova_compute[227313]: 2026-01-26 18:48:45.922 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:46.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:47 np0005596062 podman[266841]: 2026-01-26 18:48:47.669529256 +0000 UTC m=+0.075774554 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 13:48:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:47.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:47 np0005596062 nova_compute[227313]: 2026-01-26 18:48:47.768 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:47 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:48.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:48:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:48:49 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:48:49 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 26 13:48:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:49.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:50.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:50 np0005596062 nova_compute[227313]: 2026-01-26 18:48:50.925 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:51 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 26 13:48:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:51.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 26 13:48:52 np0005596062 nova_compute[227313]: 2026-01-26 18:48:52.771 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:52 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 26 13:48:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:53.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:54.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:54 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:48:54 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:48:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:55.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:55 np0005596062 nova_compute[227313]: 2026-01-26 18:48:55.927 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:56.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:48:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:48:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:57.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:48:57 np0005596062 nova_compute[227313]: 2026-01-26 18:48:57.771 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:48:57 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:48:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:48:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:48:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:48:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:48:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:48:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:48:59.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:00.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:00 np0005596062 nova_compute[227313]: 2026-01-26 18:49:00.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:01.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:01 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 26 13:49:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:02.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:02 np0005596062 nova_compute[227313]: 2026-01-26 18:49:02.773 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:49:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1948368837' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:49:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:49:03 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1948368837' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:49:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:03.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:49:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:04.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:49:05 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 26 13:49:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:05.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:05 np0005596062 nova_compute[227313]: 2026-01-26 18:49:05.933 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:06.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:07.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:07 np0005596062 nova_compute[227313]: 2026-01-26 18:49:07.773 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:07 np0005596062 podman[267084]: 2026-01-26 18:49:07.859614583 +0000 UTC m=+0.066816683 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent)
Jan 26 13:49:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 26 13:49:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:49:09.202 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:49:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:49:09.203 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:49:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:49:09.203 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:49:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:09.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:49:09.789 143929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:b1:dd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:cd:89:5f:28:db'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 13:49:09 np0005596062 nova_compute[227313]: 2026-01-26 18:49:09.789 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:49:09.790 143929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 13:49:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:10 np0005596062 nova_compute[227313]: 2026-01-26 18:49:10.934 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:11.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:11 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:49:11.793 143929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9838f21e-c1ce-4cfa-829e-a12b9d657d8a, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 13:49:11 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 26 13:49:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:12.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:12 np0005596062 nova_compute[227313]: 2026-01-26 18:49:12.776 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:13.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:14.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:15.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:15 np0005596062 nova_compute[227313]: 2026-01-26 18:49:15.936 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:16.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 26 13:49:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:17.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:17 np0005596062 nova_compute[227313]: 2026-01-26 18:49:17.778 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:17 np0005596062 podman[267109]: 2026-01-26 18:49:17.901364348 +0000 UTC m=+0.102528160 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:49:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:18.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:19.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:20.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:20 np0005596062 nova_compute[227313]: 2026-01-26 18:49:20.938 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:21.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:22.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:22 np0005596062 nova_compute[227313]: 2026-01-26 18:49:22.781 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.111 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.112 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.112 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.112 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.112 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:49:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:49:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2709146096' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.550 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.695 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.696 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4727MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.697 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.697 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:49:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:23.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.786 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.786 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:49:23 np0005596062 nova_compute[227313]: 2026-01-26 18:49:23.813 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:49:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:49:24 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1658192091' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:49:24 np0005596062 nova_compute[227313]: 2026-01-26 18:49:24.259 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:49:24 np0005596062 nova_compute[227313]: 2026-01-26 18:49:24.266 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:49:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:24.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:24 np0005596062 nova_compute[227313]: 2026-01-26 18:49:24.287 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:49:24 np0005596062 nova_compute[227313]: 2026-01-26 18:49:24.288 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:49:24 np0005596062 nova_compute[227313]: 2026-01-26 18:49:24.288 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:49:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:25.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:25 np0005596062 nova_compute[227313]: 2026-01-26 18:49:25.941 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:26.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:27.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:27 np0005596062 nova_compute[227313]: 2026-01-26 18:49:27.783 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:28.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:29.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:30.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:30 np0005596062 nova_compute[227313]: 2026-01-26 18:49:30.943 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:31.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:32.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:32 np0005596062 nova_compute[227313]: 2026-01-26 18:49:32.786 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:33 np0005596062 nova_compute[227313]: 2026-01-26 18:49:33.289 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:33 np0005596062 nova_compute[227313]: 2026-01-26 18:49:33.290 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:33 np0005596062 nova_compute[227313]: 2026-01-26 18:49:33.290 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:33 np0005596062 nova_compute[227313]: 2026-01-26 18:49:33.290 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:49:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:33.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:34 np0005596062 nova_compute[227313]: 2026-01-26 18:49:34.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:34 np0005596062 nova_compute[227313]: 2026-01-26 18:49:34.050 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:49:34 np0005596062 nova_compute[227313]: 2026-01-26 18:49:34.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:49:34 np0005596062 nova_compute[227313]: 2026-01-26 18:49:34.074 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:49:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:34.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:35.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:35 np0005596062 nova_compute[227313]: 2026-01-26 18:49:35.945 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:36 np0005596062 nova_compute[227313]: 2026-01-26 18:49:36.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:36.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:49:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:37.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:49:37 np0005596062 nova_compute[227313]: 2026-01-26 18:49:37.802 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:38 np0005596062 nova_compute[227313]: 2026-01-26 18:49:38.045 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:38.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:38 np0005596062 podman[267242]: 2026-01-26 18:49:38.84316175 +0000 UTC m=+0.053759993 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 13:49:39 np0005596062 nova_compute[227313]: 2026-01-26 18:49:39.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:39 np0005596062 nova_compute[227313]: 2026-01-26 18:49:39.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:49:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:39.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:40.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:40 np0005596062 nova_compute[227313]: 2026-01-26 18:49:40.947 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:41.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:42.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:42 np0005596062 nova_compute[227313]: 2026-01-26 18:49:42.842 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:49:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:43.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:49:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:44.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:45.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:45 np0005596062 nova_compute[227313]: 2026-01-26 18:49:45.949 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:46.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:47.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:47 np0005596062 nova_compute[227313]: 2026-01-26 18:49:47.844 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:48.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:48 np0005596062 podman[267317]: 2026-01-26 18:49:48.932641575 +0000 UTC m=+0.140016297 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 26 13:49:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:49.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:50.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:50 np0005596062 nova_compute[227313]: 2026-01-26 18:49:50.951 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:51.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:52.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:52 np0005596062 nova_compute[227313]: 2026-01-26 18:49:52.846 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:53.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:54.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:55.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:55 np0005596062 nova_compute[227313]: 2026-01-26 18:49:55.953 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:56.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:49:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:49:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:49:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:49:56 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:49:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:49:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:49:57 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:49:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:49:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:57.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:49:57 np0005596062 nova_compute[227313]: 2026-01-26 18:49:57.894 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:49:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:49:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:49:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:49:58.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.763655) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453399763730, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 1294, "num_deletes": 254, "total_data_size": 2764345, "memory_usage": 2800544, "flush_reason": "Manual Compaction"}
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453399777035, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 1811570, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 53531, "largest_seqno": 54820, "table_properties": {"data_size": 1805789, "index_size": 3112, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12662, "raw_average_key_size": 20, "raw_value_size": 1794103, "raw_average_value_size": 2903, "num_data_blocks": 136, "num_entries": 618, "num_filter_entries": 618, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769453307, "oldest_key_time": 1769453307, "file_creation_time": 1769453399, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 13422 microseconds, and 4351 cpu microseconds.
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.777080) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 1811570 bytes OK
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.777097) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.779520) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.779532) EVENT_LOG_v1 {"time_micros": 1769453399779528, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.779549) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 2758189, prev total WAL file size 2758189, number of live WAL files 2.
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.780548) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(1769KB)], [105(10045KB)]
Jan 26 13:49:59 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453399780622, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 12098189, "oldest_snapshot_seqno": -1}
Jan 26 13:49:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:49:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:49:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:49:59.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:00.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7511 keys, 10149369 bytes, temperature: kUnknown
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453400322896, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 10149369, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10102048, "index_size": 27433, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18821, "raw_key_size": 194162, "raw_average_key_size": 25, "raw_value_size": 9970205, "raw_average_value_size": 1327, "num_data_blocks": 1085, "num_entries": 7511, "num_filter_entries": 7511, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769453399, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.323267) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 10149369 bytes
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.528456) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 22.3 rd, 18.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.8 +0.0 blob) out(9.7 +0.0 blob), read-write-amplify(12.3) write-amplify(5.6) OK, records in: 8038, records dropped: 527 output_compression: NoCompression
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.528491) EVENT_LOG_v1 {"time_micros": 1769453400528477, "job": 66, "event": "compaction_finished", "compaction_time_micros": 542381, "compaction_time_cpu_micros": 27479, "output_level": 6, "num_output_files": 1, "total_output_size": 10149369, "num_input_records": 8038, "num_output_records": 7511, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453400528974, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453400530662, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:49:59.780432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.530767) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.530775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.530779) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.530783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:00 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:00.530788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:00 np0005596062 nova_compute[227313]: 2026-01-26 18:50:00.956 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:01 np0005596062 ceph-mon[77178]: overall HEALTH_OK
Jan 26 13:50:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:01.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:02.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:02 np0005596062 nova_compute[227313]: 2026-01-26 18:50:02.896 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:03.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:50:03 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:50:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:04.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:05.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:05 np0005596062 nova_compute[227313]: 2026-01-26 18:50:05.959 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:06.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:07.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:07 np0005596062 nova_compute[227313]: 2026-01-26 18:50:07.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:08.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:50:09.203 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:50:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:50:09.204 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:50:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:50:09.204 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:50:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:09.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:09 np0005596062 podman[267585]: 2026-01-26 18:50:09.837546066 +0000 UTC m=+0.049291633 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 13:50:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:10 np0005596062 nova_compute[227313]: 2026-01-26 18:50:10.960 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:50:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:11.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:50:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:12 np0005596062 nova_compute[227313]: 2026-01-26 18:50:12.931 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:13.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:14.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:15.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:15 np0005596062 nova_compute[227313]: 2026-01-26 18:50:15.963 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:16.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:17.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:17 np0005596062 nova_compute[227313]: 2026-01-26 18:50:17.979 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:18.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:19.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:19 np0005596062 podman[267610]: 2026-01-26 18:50:19.923295161 +0000 UTC m=+0.126517625 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller)
Jan 26 13:50:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:20.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:20 np0005596062 nova_compute[227313]: 2026-01-26 18:50:20.964 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:21.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:23 np0005596062 nova_compute[227313]: 2026-01-26 18:50:23.012 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:23.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:24.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.081 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.082 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:50:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:50:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1574209475' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.534 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.707 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.708 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4706MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.709 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.709 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.787 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.787 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.802 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:50:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:25.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:25 np0005596062 nova_compute[227313]: 2026-01-26 18:50:25.967 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1341406158' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:50:26 np0005596062 nova_compute[227313]: 2026-01-26 18:50:26.216 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:50:26 np0005596062 nova_compute[227313]: 2026-01-26 18:50:26.221 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:50:26 np0005596062 nova_compute[227313]: 2026-01-26 18:50:26.259 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:50:26 np0005596062 nova_compute[227313]: 2026-01-26 18:50:26.261 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:50:26 np0005596062 nova_compute[227313]: 2026-01-26 18:50:26.261 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:50:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:50:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:26.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.905848) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453426905900, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 514, "num_deletes": 256, "total_data_size": 727181, "memory_usage": 736792, "flush_reason": "Manual Compaction"}
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453426913030, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 479631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54825, "largest_seqno": 55334, "table_properties": {"data_size": 476890, "index_size": 777, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6417, "raw_average_key_size": 18, "raw_value_size": 471346, "raw_average_value_size": 1350, "num_data_blocks": 34, "num_entries": 349, "num_filter_entries": 349, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769453401, "oldest_key_time": 1769453401, "file_creation_time": 1769453426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 7221 microseconds, and 2920 cpu microseconds.
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.913072) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 479631 bytes OK
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.913092) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.915154) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.915169) EVENT_LOG_v1 {"time_micros": 1769453426915164, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.915185) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 724143, prev total WAL file size 724143, number of live WAL files 2.
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.915710) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373630' seq:72057594037927935, type:22 .. '6C6F676D0032303132' seq:0, type:0; will stop at (end)
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(468KB)], [108(9911KB)]
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453426915748, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 10629000, "oldest_snapshot_seqno": -1}
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 7336 keys, 10497745 bytes, temperature: kUnknown
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453426996906, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 10497745, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10450729, "index_size": 27576, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18373, "raw_key_size": 191451, "raw_average_key_size": 26, "raw_value_size": 10321000, "raw_average_value_size": 1406, "num_data_blocks": 1089, "num_entries": 7336, "num_filter_entries": 7336, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769453426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.997290) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 10497745 bytes
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.999330) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.8 rd, 129.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 9.7 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(44.0) write-amplify(21.9) OK, records in: 7860, records dropped: 524 output_compression: NoCompression
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.999396) EVENT_LOG_v1 {"time_micros": 1769453426999380, "job": 68, "event": "compaction_finished", "compaction_time_micros": 81285, "compaction_time_cpu_micros": 23768, "output_level": 6, "num_output_files": 1, "total_output_size": 10497745, "num_input_records": 7860, "num_output_records": 7336, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:50:26 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453426999833, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453427003334, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:26.915610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:27.003418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:27.003425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:27.003428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:27.003431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:27 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:50:27.003434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:50:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:27.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:28 np0005596062 nova_compute[227313]: 2026-01-26 18:50:28.014 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:28.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:29.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:30.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:30 np0005596062 nova_compute[227313]: 2026-01-26 18:50:30.969 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:31.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:32.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:33 np0005596062 nova_compute[227313]: 2026-01-26 18:50:33.015 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:33 np0005596062 nova_compute[227313]: 2026-01-26 18:50:33.262 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:33 np0005596062 nova_compute[227313]: 2026-01-26 18:50:33.262 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:50:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:33.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:34 np0005596062 nova_compute[227313]: 2026-01-26 18:50:34.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:34 np0005596062 nova_compute[227313]: 2026-01-26 18:50:34.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:50:34 np0005596062 nova_compute[227313]: 2026-01-26 18:50:34.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:50:34 np0005596062 nova_compute[227313]: 2026-01-26 18:50:34.068 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:50:34 np0005596062 nova_compute[227313]: 2026-01-26 18:50:34.069 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:34 np0005596062 nova_compute[227313]: 2026-01-26 18:50:34.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:34.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:35.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:35 np0005596062 nova_compute[227313]: 2026-01-26 18:50:35.976 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:36.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:37 np0005596062 nova_compute[227313]: 2026-01-26 18:50:37.065 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:37.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:38 np0005596062 nova_compute[227313]: 2026-01-26 18:50:38.017 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:38 np0005596062 nova_compute[227313]: 2026-01-26 18:50:38.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:38.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:39.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:40 np0005596062 nova_compute[227313]: 2026-01-26 18:50:40.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:40 np0005596062 nova_compute[227313]: 2026-01-26 18:50:40.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:40.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:40 np0005596062 podman[267741]: 2026-01-26 18:50:40.877729478 +0000 UTC m=+0.086042469 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 13:50:41 np0005596062 nova_compute[227313]: 2026-01-26 18:50:41.020 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:41 np0005596062 nova_compute[227313]: 2026-01-26 18:50:41.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:50:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:41.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:42.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:43 np0005596062 nova_compute[227313]: 2026-01-26 18:50:43.020 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:43.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:44.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:45.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:46 np0005596062 nova_compute[227313]: 2026-01-26 18:50:46.023 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:50:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:47.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:50:48 np0005596062 nova_compute[227313]: 2026-01-26 18:50:48.057 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:48.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:49.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:50.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:50 np0005596062 podman[267815]: 2026-01-26 18:50:50.853860794 +0000 UTC m=+0.068709544 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 26 13:50:51 np0005596062 nova_compute[227313]: 2026-01-26 18:50:51.024 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:51.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:52.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:53 np0005596062 nova_compute[227313]: 2026-01-26 18:50:53.058 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:53.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:54.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:55.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:56 np0005596062 nova_compute[227313]: 2026-01-26 18:50:56.026 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:56.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:50:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:57.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:50:58 np0005596062 nova_compute[227313]: 2026-01-26 18:50:58.059 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:50:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:50:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:50:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:50:58.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:50:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:50:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:50:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:50:59.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:51:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:01 np0005596062 nova_compute[227313]: 2026-01-26 18:51:01.075 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:01.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:02.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:03 np0005596062 nova_compute[227313]: 2026-01-26 18:51:03.062 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:03.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:04.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:51:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:51:04 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:51:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:05.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:06 np0005596062 nova_compute[227313]: 2026-01-26 18:51:06.130 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:06.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:07.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:08 np0005596062 nova_compute[227313]: 2026-01-26 18:51:08.064 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:08.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:51:09.205 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:51:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:51:09.205 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:51:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:51:09.205 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:51:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:09.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:10.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:11 np0005596062 nova_compute[227313]: 2026-01-26 18:51:11.186 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:11 np0005596062 podman[268057]: 2026-01-26 18:51:11.322906645 +0000 UTC m=+0.050683500 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:51:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:51:11 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:51:11 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:11 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:11 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:11.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:12.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:13 np0005596062 nova_compute[227313]: 2026-01-26 18:51:13.064 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:13 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:13 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:13 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:13.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:14.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:15 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:15 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:15 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:15.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:16 np0005596062 nova_compute[227313]: 2026-01-26 18:51:16.187 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:16.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:17 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:17 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:17 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:17.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:18 np0005596062 nova_compute[227313]: 2026-01-26 18:51:18.067 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:18.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:19 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:19 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:19 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:19.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:20.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:21 np0005596062 nova_compute[227313]: 2026-01-26 18:51:21.189 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:21 np0005596062 podman[268108]: 2026-01-26 18:51:21.877593958 +0000 UTC m=+0.089671026 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 13:51:21 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:21 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:21 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:21.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:22.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:23 np0005596062 nova_compute[227313]: 2026-01-26 18:51:23.070 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:23 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:23 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:23 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:23.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:24.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:25 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:25 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:25 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:25.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.081 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.082 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.082 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.082 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.243 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:26.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:51:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1849331919' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.536 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.696 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.698 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4714MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.698 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.698 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.784 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.785 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:51:26 np0005596062 nova_compute[227313]: 2026-01-26 18:51:26.805 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:51:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:51:27 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/7107380' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:51:27 np0005596062 nova_compute[227313]: 2026-01-26 18:51:27.272 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:51:27 np0005596062 nova_compute[227313]: 2026-01-26 18:51:27.279 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:51:27 np0005596062 nova_compute[227313]: 2026-01-26 18:51:27.303 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:51:27 np0005596062 nova_compute[227313]: 2026-01-26 18:51:27.305 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:51:27 np0005596062 nova_compute[227313]: 2026-01-26 18:51:27.305 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:51:27 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:27 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:27 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:27.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:28 np0005596062 nova_compute[227313]: 2026-01-26 18:51:28.072 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:28.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:29 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:29 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:29 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:29.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:30.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:31 np0005596062 nova_compute[227313]: 2026-01-26 18:51:31.278 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:31 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:31 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:31 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:31.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:32.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:33 np0005596062 nova_compute[227313]: 2026-01-26 18:51:33.075 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:33 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:33 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:33 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:33.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:34 np0005596062 nova_compute[227313]: 2026-01-26 18:51:34.305 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:34 np0005596062 nova_compute[227313]: 2026-01-26 18:51:34.306 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:51:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:34.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:35 np0005596062 nova_compute[227313]: 2026-01-26 18:51:35.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:35 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:35 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:35 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:35.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:36 np0005596062 nova_compute[227313]: 2026-01-26 18:51:36.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:36 np0005596062 nova_compute[227313]: 2026-01-26 18:51:36.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:51:36 np0005596062 nova_compute[227313]: 2026-01-26 18:51:36.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:51:36 np0005596062 nova_compute[227313]: 2026-01-26 18:51:36.066 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:51:36 np0005596062 nova_compute[227313]: 2026-01-26 18:51:36.066 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:36 np0005596062 nova_compute[227313]: 2026-01-26 18:51:36.281 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:36.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:37 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:37 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:37 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:37.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:38 np0005596062 nova_compute[227313]: 2026-01-26 18:51:38.142 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:38.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:39 np0005596062 nova_compute[227313]: 2026-01-26 18:51:39.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:39 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:39 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:39 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:39.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:40 np0005596062 nova_compute[227313]: 2026-01-26 18:51:40.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:40.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:41 np0005596062 nova_compute[227313]: 2026-01-26 18:51:41.345 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:41 np0005596062 podman[268241]: 2026-01-26 18:51:41.837537834 +0000 UTC m=+0.053996030 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 13:51:41 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:41 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:41 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:41.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:42 np0005596062 nova_compute[227313]: 2026-01-26 18:51:42.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:42 np0005596062 nova_compute[227313]: 2026-01-26 18:51:42.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:51:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:51:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:42.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:51:43 np0005596062 nova_compute[227313]: 2026-01-26 18:51:43.144 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:51:43 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 11K writes, 56K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.11 GB, 0.03 MB/s#012Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.11 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1582 writes, 7680 keys, 1582 commit groups, 1.0 writes per commit group, ingest: 16.03 MB, 0.03 MB/s#012Interval WAL: 1581 writes, 1581 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     67.8      1.03              0.24        34    0.030       0      0       0.0       0.0#012  L6      1/0   10.01 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.3     85.8     71.6      4.21              0.97        33    0.127    201K    18K       0.0       0.0#012 Sum      1/0   10.01 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.3     68.9     70.8      5.24              1.21        67    0.078    201K    18K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0     64.3     65.5      1.09              0.27        12    0.091     46K   3100       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0     85.8     71.6      4.21              0.97        33    0.127    201K    18K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     67.9      1.03              0.24        33    0.031       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.068, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.36 GB write, 0.09 MB/s write, 0.35 GB read, 0.09 MB/s read, 5.2 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d9cbc8f1f0#2 capacity: 304.00 MB usage: 42.18 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000257 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2396,40.69 MB,13.3844%) FilterBlock(67,571.92 KB,0.183723%) IndexBlock(67,950.20 KB,0.305241%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 13:51:43 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:43 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:43 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:43.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:44.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:45 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:45 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:45 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:45.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
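The `beast:` lines above follow a combined-log-style layout: request pointer, client IP, user (`anonymous` for these health checks), bracketed timestamp, quoted request line, HTTP status, bytes sent, three unused dash fields, and latency. A hypothetical parser for that layout; the regex and field names are inferred from the samples here, not from a documented radosgw format:

```python
import re

# Assumed field layout, inferred from the sample lines in this log.
BEAST_RE = re.compile(
    r'beast: (?P<ptr>0x[0-9a-f]+): (?P<client>\S+) - (?P<user>\S+) '
    r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
    r'- - - latency=(?P<latency>[\d.]+)s'
)

line = ('beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous '
        '[26/Jan/2026:18:51:43.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.000000000s')

m = BEAST_RE.search(line)
print(m.group('client'), m.group('request'), m.group('status'),
      float(m.group('latency')))
```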
Jan 26 13:51:46 np0005596062 nova_compute[227313]: 2026-01-26 18:51:46.347 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:47 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:47 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:47 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:47.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:48 np0005596062 nova_compute[227313]: 2026-01-26 18:51:48.146 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
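The `_set_new_cache_sizes` values above are raw byte counts. A small sketch converting them to MiB for readability (field names are taken from the log line itself); `kv_alloc` works out to exactly 304 MiB, matching the `capacity: 304.00 MB` of the rocksdb block cache reported elsewhere in this log:

```python
import re

line = ("mon.compute-2@1(peon).osd e240 _set_new_cache_sizes "
        "cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 "
        "kv_alloc: 318767104")

# Pull out every "name: bytes" pair and convert to MiB.
mib = {k: int(v) / 2**20 for k, v in re.findall(r'(\w+):\s*(\d+)', line)}
print({k: round(v, 1) for k, v in mib.items()})
# kv_alloc is exactly 304.0 MiB here.
```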
Jan 26 13:51:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:48.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:49 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:49 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:49 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:49.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:50.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:51 np0005596062 nova_compute[227313]: 2026-01-26 18:51:51.349 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:51 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:51 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:51 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:51.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:51:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:52.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:52 np0005596062 podman[268316]: 2026-01-26 18:51:52.877002188 +0000 UTC m=+0.089583154 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:51:53 np0005596062 nova_compute[227313]: 2026-01-26 18:51:53.148 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:53 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:53 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:53 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:53.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:54.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:55 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:55 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:55 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:55.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:56 np0005596062 nova_compute[227313]: 2026-01-26 18:51:56.352 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:56.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:57 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:57 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:51:57 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:57.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:51:58 np0005596062 nova_compute[227313]: 2026-01-26 18:51:58.151 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:51:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:51:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:51:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:51:58.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:51:59 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:51:59 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:51:59 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:51:59.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:00.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:01 np0005596062 nova_compute[227313]: 2026-01-26 18:52:01.411 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:01 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:01 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:01 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:02.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:03 np0005596062 nova_compute[227313]: 2026-01-26 18:52:03.153 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:03 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:03 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:03 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:04.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:05 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:05 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:05 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:05.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:06 np0005596062 nova_compute[227313]: 2026-01-26 18:52:06.415 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:06.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:07 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:07 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:07 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:07.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:08 np0005596062 nova_compute[227313]: 2026-01-26 18:52:08.156 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:08.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:52:09.206 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:52:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:52:09.206 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:52:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:52:09.206 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:52:09 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:09 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:09 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:09.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:10.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:11 np0005596062 nova_compute[227313]: 2026-01-26 18:52:11.418 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:11.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:12 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:52:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:12.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:12 np0005596062 podman[268532]: 2026-01-26 18:52:12.881227682 +0000 UTC m=+0.080693335 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 13:52:13 np0005596062 nova_compute[227313]: 2026-01-26 18:52:13.158 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:52:13 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:52:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:14.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:14.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:52:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:16.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:52:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:52:16 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.1 total, 600.0 interval
Cumulative writes: 20K writes, 73K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
Cumulative WAL: 20K writes, 7078 syncs, 2.87 writes per sync, written: 0.06 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2124 writes, 6160 keys, 2124 commit groups, 1.0 writes per commit group, ingest: 4.65 MB, 0.01 MB/s
Interval WAL: 2124 writes, 952 syncs, 2.23 writes per sync, written: 0.00 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
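As a quick sanity check of the DB Stats figures above: the interval write rate is just ingest divided by the interval length, and 4.65 MB over the 600-second interval is about 0.0078 MB/s, consistent with the 0.01 MB/s the dump reports at two decimal places. Plain arithmetic, using only the logged numbers:

```python
interval_secs = 600.0        # "Uptime(secs): 4200.1 total, 600.0 interval"
interval_ingest_mb = 4.65    # "Interval writes: ... ingest: 4.65 MB"

rate = interval_ingest_mb / interval_secs
print(f"{rate:.4f} MB/s")    # 0.0078 MB/s
```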
Jan 26 13:52:16 np0005596062 nova_compute[227313]: 2026-01-26 18:52:16.421 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:16.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:18.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:18 np0005596062 nova_compute[227313]: 2026-01-26 18:52:18.159 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:18.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:52:19 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:52:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:20.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:20 np0005596062 nova_compute[227313]: 2026-01-26 18:52:20.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:20 np0005596062 nova_compute[227313]: 2026-01-26 18:52:20.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 13:52:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:20.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:21 np0005596062 nova_compute[227313]: 2026-01-26 18:52:21.423 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:22.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:22.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:23 np0005596062 nova_compute[227313]: 2026-01-26 18:52:23.185 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:23 np0005596062 nova_compute[227313]: 2026-01-26 18:52:23.236 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:23 np0005596062 nova_compute[227313]: 2026-01-26 18:52:23.237 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 13:52:23 np0005596062 nova_compute[227313]: 2026-01-26 18:52:23.256 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 13:52:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:23 np0005596062 podman[268607]: 2026-01-26 18:52:23.887797044 +0000 UTC m=+0.097432445 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 13:52:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:24.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:24.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:26.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:26 np0005596062 nova_compute[227313]: 2026-01-26 18:52:26.426 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:26.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:28.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.070 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.096 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.096 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.096 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.187 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:28.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:52:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1680594548' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.609 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.776 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.777 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4712MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.777 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.778 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.848 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.849 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:52:28 np0005596062 nova_compute[227313]: 2026-01-26 18:52:28.864 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:52:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:52:29 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4059686355' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:52:29 np0005596062 nova_compute[227313]: 2026-01-26 18:52:29.328 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:52:29 np0005596062 nova_compute[227313]: 2026-01-26 18:52:29.335 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:52:29 np0005596062 nova_compute[227313]: 2026-01-26 18:52:29.350 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:52:29 np0005596062 nova_compute[227313]: 2026-01-26 18:52:29.352 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:52:29 np0005596062 nova_compute[227313]: 2026-01-26 18:52:29.352 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:52:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:30.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:30.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:31 np0005596062 nova_compute[227313]: 2026-01-26 18:52:31.460 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:32.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:32.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:33 np0005596062 nova_compute[227313]: 2026-01-26 18:52:33.190 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:34.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:34 np0005596062 nova_compute[227313]: 2026-01-26 18:52:34.333 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:34 np0005596062 nova_compute[227313]: 2026-01-26 18:52:34.333 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:52:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:34.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:36.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:36 np0005596062 nova_compute[227313]: 2026-01-26 18:52:36.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:36 np0005596062 nova_compute[227313]: 2026-01-26 18:52:36.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:52:36 np0005596062 nova_compute[227313]: 2026-01-26 18:52:36.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:52:36 np0005596062 nova_compute[227313]: 2026-01-26 18:52:36.064 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:52:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:36.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:36 np0005596062 nova_compute[227313]: 2026-01-26 18:52:36.463 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:37 np0005596062 nova_compute[227313]: 2026-01-26 18:52:37.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:37 np0005596062 nova_compute[227313]: 2026-01-26 18:52:37.051 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:38.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:38 np0005596062 nova_compute[227313]: 2026-01-26 18:52:38.192 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:38.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:40.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 26 13:52:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2857574485' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 13:52:40 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 26 13:52:40 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2857574485' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 13:52:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:52:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:40.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:52:41 np0005596062 nova_compute[227313]: 2026-01-26 18:52:41.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:41 np0005596062 nova_compute[227313]: 2026-01-26 18:52:41.062 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:41 np0005596062 nova_compute[227313]: 2026-01-26 18:52:41.515 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:42.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:42 np0005596062 nova_compute[227313]: 2026-01-26 18:52:42.060 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:42.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:43 np0005596062 nova_compute[227313]: 2026-01-26 18:52:43.192 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:43 np0005596062 podman[268737]: 2026-01-26 18:52:43.872681139 +0000 UTC m=+0.084685032 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 13:52:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:44.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:44 np0005596062 nova_compute[227313]: 2026-01-26 18:52:44.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:44 np0005596062 nova_compute[227313]: 2026-01-26 18:52:44.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:44.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:46.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:46.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:46 np0005596062 nova_compute[227313]: 2026-01-26 18:52:46.517 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:48.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:48 np0005596062 nova_compute[227313]: 2026-01-26 18:52:48.194 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:48.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:50.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:50.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:51 np0005596062 nova_compute[227313]: 2026-01-26 18:52:51.567 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:52.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:52.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:53 np0005596062 nova_compute[227313]: 2026-01-26 18:52:53.196 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:54.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:54.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:54 np0005596062 podman[268812]: 2026-01-26 18:52:54.953473072 +0000 UTC m=+0.159902470 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller)
Jan 26 13:52:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:56.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:52:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:56.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:52:56 np0005596062 nova_compute[227313]: 2026-01-26 18:52:56.571 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:58 np0005596062 nova_compute[227313]: 2026-01-26 18:52:58.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:52:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:52:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:52:58.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:52:58 np0005596062 nova_compute[227313]: 2026-01-26 18:52:58.235 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:52:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:52:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:52:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:52:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:52:58.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:53:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:53:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:00.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:53:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:00.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:01 np0005596062 nova_compute[227313]: 2026-01-26 18:53:01.616 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:02.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:02.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:02 np0005596062 nova_compute[227313]: 2026-01-26 18:53:02.632 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:03 np0005596062 nova_compute[227313]: 2026-01-26 18:53:03.237 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:04.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:04.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:06.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:53:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:06.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:53:06 np0005596062 nova_compute[227313]: 2026-01-26 18:53:06.619 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:08.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:08 np0005596062 nova_compute[227313]: 2026-01-26 18:53:08.239 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:08.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:53:09.207 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:53:09.208 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:53:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:53:09.208 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:53:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:10.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:10.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:11 np0005596062 nova_compute[227313]: 2026-01-26 18:53:11.621 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:12.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:12.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:13 np0005596062 nova_compute[227313]: 2026-01-26 18:53:13.241 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:14.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:14.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:14 np0005596062 podman[268898]: 2026-01-26 18:53:14.833155275 +0000 UTC m=+0.050131635 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 26 13:53:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:16.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:16.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:16 np0005596062 nova_compute[227313]: 2026-01-26 18:53:16.623 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:18.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:18 np0005596062 nova_compute[227313]: 2026-01-26 18:53:18.292 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:18.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:19 np0005596062 podman[269091]: 2026-01-26 18:53:19.527075459 +0000 UTC m=+0.061310285 container exec 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 13:53:19 np0005596062 podman[269091]: 2026-01-26 18:53:19.622034136 +0000 UTC m=+0.156268962 container exec_died 0054c4cc1a1e964917431edbd72f8dd082fcc5d67ead715426b23b35e604d4df (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Jan 26 13:53:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:20.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:20 np0005596062 podman[269246]: 2026-01-26 18:53:20.293054083 +0000 UTC m=+0.067726007 container exec 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:53:20 np0005596062 podman[269246]: 2026-01-26 18:53:20.334100834 +0000 UTC m=+0.108772708 container exec_died 162db9b424067387668f73320464776d40b6b552f250ff2376b6c062a433fa92 (image=quay.io/ceph/haproxy:2.3, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-haproxy-rgw-default-compute-2-dyvhne)
Jan 26 13:53:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:20.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:20 np0005596062 podman[269311]: 2026-01-26 18:53:20.583464963 +0000 UTC m=+0.060001781 container exec 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, description=keepalived for Ceph)
Jan 26 13:53:20 np0005596062 podman[269311]: 2026-01-26 18:53:20.618377259 +0000 UTC m=+0.094914047 container exec_died 339afa45a428a62db0ddc984419f4dff934d9c227e7842213d116d2e774d6198 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d4cd1917-5876-51b6-bc64-65a16199754d-keepalived-rgw-default-compute-2-alfrff, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, distribution-scope=public, release=1793)
Jan 26 13:53:21 np0005596062 nova_compute[227313]: 2026-01-26 18:53:21.650 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:53:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:53:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:53:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:53:21 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:53:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:22.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:53:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:22.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:53:23 np0005596062 nova_compute[227313]: 2026-01-26 18:53:23.295 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:24.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:25 np0005596062 podman[269476]: 2026-01-26 18:53:25.913902387 +0000 UTC m=+0.120880423 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 13:53:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:26.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:26 np0005596062 nova_compute[227313]: 2026-01-26 18:53:26.652 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.077 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.077 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.077 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.078 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.078 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:53:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:28.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.296 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:53:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3412769831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:53:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:28.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.517 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:53:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:53:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.673 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.674 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4702MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.674 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.675 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.849 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.849 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.909 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing inventories for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.960 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating ProviderTree inventory for provider 65600a65-69bc-488c-8c8c-71cbf43e523a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.961 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Updating inventory in ProviderTree for provider 65600a65-69bc-488c-8c8c-71cbf43e523a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.973 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing aggregate associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 13:53:28 np0005596062 nova_compute[227313]: 2026-01-26 18:53:28.990 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Refreshing trait associations for resource provider 65600a65-69bc-488c-8c8c-71cbf43e523a, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 13:53:29 np0005596062 nova_compute[227313]: 2026-01-26 18:53:29.003 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:53:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:53:29 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2252812177' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:53:29 np0005596062 nova_compute[227313]: 2026-01-26 18:53:29.423 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:53:29 np0005596062 nova_compute[227313]: 2026-01-26 18:53:29.429 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:53:29 np0005596062 nova_compute[227313]: 2026-01-26 18:53:29.454 227317 DEBUG nova.scheduler.client.report [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed for provider 65600a65-69bc-488c-8c8c-71cbf43e523a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 13:53:29 np0005596062 nova_compute[227313]: 2026-01-26 18:53:29.456 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 13:53:29 np0005596062 nova_compute[227313]: 2026-01-26 18:53:29.456 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.781s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:53:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:30.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:30.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:31 np0005596062 nova_compute[227313]: 2026-01-26 18:53:31.655 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:32.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:32 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:32 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:32 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:32.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:33 np0005596062 nova_compute[227313]: 2026-01-26 18:53:33.298 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:33 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:34.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:34 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:34 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:34 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:34.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:35 np0005596062 nova_compute[227313]: 2026-01-26 18:53:35.458 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:35 np0005596062 nova_compute[227313]: 2026-01-26 18:53:35.458 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 13:53:36 np0005596062 nova_compute[227313]: 2026-01-26 18:53:36.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:36 np0005596062 nova_compute[227313]: 2026-01-26 18:53:36.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 13:53:36 np0005596062 nova_compute[227313]: 2026-01-26 18:53:36.051 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 13:53:36 np0005596062 nova_compute[227313]: 2026-01-26 18:53:36.072 227317 DEBUG nova.compute.manager [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 13:53:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:36.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:36 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:36 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:36 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:36.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:36 np0005596062 nova_compute[227313]: 2026-01-26 18:53:36.657 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:38.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:38 np0005596062 nova_compute[227313]: 2026-01-26 18:53:38.301 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:38 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:38 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:38 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:38 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:38.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:39 np0005596062 nova_compute[227313]: 2026-01-26 18:53:39.049 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:39 np0005596062 nova_compute[227313]: 2026-01-26 18:53:39.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:40.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:40 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:40 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:40 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:40.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:41 np0005596062 nova_compute[227313]: 2026-01-26 18:53:41.745 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.935784) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453621935862, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 2095, "num_deletes": 251, "total_data_size": 5107300, "memory_usage": 5193776, "flush_reason": "Manual Compaction"}
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453621992780, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 3347568, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55339, "largest_seqno": 57429, "table_properties": {"data_size": 3339055, "index_size": 5263, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17357, "raw_average_key_size": 20, "raw_value_size": 3322103, "raw_average_value_size": 3862, "num_data_blocks": 231, "num_entries": 860, "num_filter_entries": 860, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769453427, "oldest_key_time": 1769453427, "file_creation_time": 1769453621, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 57070 microseconds, and 7468 cpu microseconds.
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.992857) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 3347568 bytes OK
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.992879) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.994827) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.994843) EVENT_LOG_v1 {"time_micros": 1769453621994837, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.994861) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 5098120, prev total WAL file size 5098120, number of live WAL files 2.
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.996026) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(3269KB)], [111(10MB)]
Jan 26 13:53:41 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453621996074, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 13845313, "oldest_snapshot_seqno": -1}
Jan 26 13:53:42 np0005596062 nova_compute[227313]: 2026-01-26 18:53:42.046 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 7679 keys, 11848451 bytes, temperature: kUnknown
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453622110646, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 11848451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11798056, "index_size": 30123, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19205, "raw_key_size": 199238, "raw_average_key_size": 25, "raw_value_size": 11661236, "raw_average_value_size": 1518, "num_data_blocks": 1193, "num_entries": 7679, "num_filter_entries": 7679, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769449303, "oldest_key_time": 0, "file_creation_time": 1769453621, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "672fd1c3-93d2-431e-9d5a-4531180f45cc", "db_session_id": "WVAUTHFR912YXSABJRD6", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.111017) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 11848451 bytes
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.113778) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.6 rd, 103.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 10.0 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 8196, records dropped: 517 output_compression: NoCompression
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.113830) EVENT_LOG_v1 {"time_micros": 1769453622113810, "job": 70, "event": "compaction_finished", "compaction_time_micros": 114761, "compaction_time_cpu_micros": 26287, "output_level": 6, "num_output_files": 1, "total_output_size": 11848451, "num_input_records": 8196, "num_output_records": 7679, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453622114673, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769453622117552, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:41.995960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.117655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.117661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.117662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.117664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:53:42 np0005596062 ceph-mon[77178]: rocksdb: (Original Log Time 2026/01/26-18:53:42.117667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 13:53:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:42.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:42 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:42 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:42 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:43 np0005596062 nova_compute[227313]: 2026-01-26 18:53:43.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:43 np0005596062 nova_compute[227313]: 2026-01-26 18:53:43.302 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:43 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:44 np0005596062 nova_compute[227313]: 2026-01-26 18:53:44.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:44.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:44 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:44 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:44 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:44.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:45 np0005596062 nova_compute[227313]: 2026-01-26 18:53:45.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:53:45 np0005596062 podman[269655]: 2026-01-26 18:53:45.875863928 +0000 UTC m=+0.087226581 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 13:53:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:46.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:46 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:46 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:46 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:46.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:46 np0005596062 nova_compute[227313]: 2026-01-26 18:53:46.799 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:48 np0005596062 nova_compute[227313]: 2026-01-26 18:53:48.304 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:48 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:53:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:48.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:53:48 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:48 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:48 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:48.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:50.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:50 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:50 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:50 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:50.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:51 np0005596062 nova_compute[227313]: 2026-01-26 18:53:51.801 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:52.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:52 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:52 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:52 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:52.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:53 np0005596062 nova_compute[227313]: 2026-01-26 18:53:53.344 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:53 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:54.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:54 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:54 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:53:54 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:54.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:53:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:56.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:56 np0005596062 nova_compute[227313]: 2026-01-26 18:53:56.802 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:56 np0005596062 podman[269730]: 2026-01-26 18:53:56.860516941 +0000 UTC m=+0.073429399 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 13:53:56 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:56 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:56 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:56.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:58 np0005596062 nova_compute[227313]: 2026-01-26 18:53:58.345 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:53:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:53:58.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:53:58 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:53:58 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:53:58 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:53:58 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:53:58.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:00.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:00 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:00 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:54:00 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:00.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:54:01 np0005596062 nova_compute[227313]: 2026-01-26 18:54:01.867 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:02.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:02 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:02 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:02 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:02.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:03 np0005596062 nova_compute[227313]: 2026-01-26 18:54:03.349 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:03 np0005596062 systemd-logind[781]: New session 51 of user zuul.
Jan 26 13:54:03 np0005596062 systemd[1]: Started Session 51 of User zuul.
Jan 26 13:54:03 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:54:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:04.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:04 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:04 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:04 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:04.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:06.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:06 np0005596062 nova_compute[227313]: 2026-01-26 18:54:06.869 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:06 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:06 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:06 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:06.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:07 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 26 13:54:07 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1033253783' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 13:54:08 np0005596062 nova_compute[227313]: 2026-01-26 18:54:08.350 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:08.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:08 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:54:08 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:08 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:08 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:08.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:54:09.209 143929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:54:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:54:09.209 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:54:09 np0005596062 ovn_metadata_agent[143924]: 2026-01-26 18:54:09.209 143929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:54:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:10.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:10 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:10 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:10 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:11 np0005596062 nova_compute[227313]: 2026-01-26 18:54:11.922 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:12.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:12 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:12 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:54:12 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:12.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:54:13 np0005596062 ovs-vsctl[270144]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 13:54:13 np0005596062 nova_compute[227313]: 2026-01-26 18:54:13.352 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:13 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:54:14 np0005596062 virtqemud[226715]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 13:54:14 np0005596062 virtqemud[226715]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 13:54:14 np0005596062 virtqemud[226715]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 13:54:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:14.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:14 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: cache status {prefix=cache status} (starting...)
Jan 26 13:54:14 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: client ls {prefix=client ls} (starting...)
Jan 26 13:54:14 np0005596062 lvm[270489]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 13:54:14 np0005596062 lvm[270489]: VG ceph_vg0 finished
Jan 26 13:54:14 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:14 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:54:14 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:14.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:54:15 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: damage ls {prefix=damage ls} (starting...)
Jan 26 13:54:15 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 26 13:54:15 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1410907623' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 13:54:15 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: dump loads {prefix=dump loads} (starting...)
Jan 26 13:54:15 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 26 13:54:15 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3178980633' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 13:54:16 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 26 13:54:16 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3768659611' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 13:54:16 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 26 13:54:16 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 26 13:54:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:16.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:16 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: ops {prefix=ops} (starting...)
Jan 26 13:54:16 np0005596062 podman[270756]: 2026-01-26 18:54:16.850141444 +0000 UTC m=+0.056400293 container health_status db59b7e7812c031df187ba98a4e3f0ccee5811ea9d7569f8ffd28e00a277609e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/161351924' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 26 13:54:16 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4267249440' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 13:54:16 np0005596062 nova_compute[227313]: 2026-01-26 18:54:16.925 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:16 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:16 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:16 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:16.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 26 13:54:17 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4121245188' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 13:54:17 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: session ls {prefix=session ls} (starting...)
Jan 26 13:54:17 np0005596062 ceph-mds[83671]: mds.cephfs.compute-2.oqvedy asok_command: status {prefix=status} (starting...)
Jan 26 13:54:17 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 26 13:54:17 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1301377356' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/817126341' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2765675471' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1701034166' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 13:54:18 np0005596062 nova_compute[227313]: 2026-01-26 18:54:18.354 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:18.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/821172730' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1344358914' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 13:54:18 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:54:18 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:18 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:18 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:18.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 26 13:54:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/800407753' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 13:54:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 26 13:54:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2544439027' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 13:54:19 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 26 13:54:19 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2482555856' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 13:54:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 26 13:54:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3854198954' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 13:54:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:20.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 26 13:54:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/552095978' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 13:54:20 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:20 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:20 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:20.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:20 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 26 13:54:20 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2494265680' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126017536 unmapped: 47153152 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ab285800 session 0x5647aaf2a3c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ac12ac00 session 0x5647aaef25a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423258 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126025728 unmapped: 47144960 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.471342087s of 17.688423157s, submitted: 29
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126033920 unmapped: 47136768 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad735800 session 0x5647aaf21680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422202 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad3ebc00 session 0x5647aaf20780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126042112 unmapped: 47128576 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126042112 unmapped: 47128576 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126042112 unmapped: 47128576 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126042112 unmapped: 47128576 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126042112 unmapped: 47128576 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422202 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422202 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ab389000 session 0x5647aaef2000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422202 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.531313896s of 15.821657181s, submitted: 4
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad3ea800 session 0x5647ac0e7e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647af1be000 session 0x5647ade701e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ade9ac00 session 0x5647abcb05a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b9a45000/0x0/0x1bfc00000, data 0x117831d/0x1289000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ab38ac00 session 0x5647ade1b0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126050304 unmapped: 47120384 heap: 173170688 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ab389000 session 0x5647b0d494a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ade9ac00 session 0x5647ab3f7e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647af1be000 session 0x5647b059c1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad3ea800 session 0x5647ac0e9a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1517145 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647abdf7000 session 0x5647ade710e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ab389000 session 0x5647adae4000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8db0000/0x0/0x1bfc00000, data 0x1e0b3e1/0x1f1e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad028400 session 0x5647b08a32c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace8b000 session 0x5647ab3fc780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad171000 session 0x5647ab8210e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126066688 unmapped: 51306496 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1517950 data_alloc: 218103808 data_used: 5152768
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.166245461s of 10.000266075s, submitted: 50
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126656512 unmapped: 50716672 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace9c400 session 0x5647abcb0d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ade9b400 session 0x5647ade703c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace9c400 session 0x5647abe474a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaef1800 session 0x5647abe47e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b81d8000/0x0/0x1bfc00000, data 0x29e437f/0x2af6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaf07c00 session 0x5647ab821860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1617050 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1617050 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 50626560 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1617050 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1617050 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 50618368 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647abdf5c00 session 0x5647adde4960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126771200 unmapped: 50601984 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.415607452s of 21.554430008s, submitted: 10
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b8158000/0x0/0x1bfc00000, data 0x2a6437f/0x2b76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaef1800 session 0x5647ac0e6d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126771200 unmapped: 50601984 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaf07c00 session 0x5647ab81ef00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126771200 unmapped: 50601984 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aee36400 session 0x5647ade5cb40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126771200 unmapped: 50601984 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647abdf5c00 session 0x5647ad17a3c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1615457 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace9c400 session 0x5647ab37cf00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ade9b400 session 0x5647adae41e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126779392 unmapped: 50593792 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaef1800 session 0x5647aaf2b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b930c000/0x0/0x1bfc00000, data 0x18b230d/0x19c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aee7bc00 session 0x5647ab316f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 50585600 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad6fdc00 session 0x5647abd290e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b930c000/0x0/0x1bfc00000, data 0x18b230d/0x19c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126795776 unmapped: 50577408 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647abd05800 session 0x5647ac13e1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaef1800 session 0x5647aaf205a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353927 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647abd05800 session 0x5647ab3f61e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad6fdc00 session 0x5647abd290e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1ba624000/0x0/0x1bfc00000, data 0x59a2ab/0x6a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647b0f0f800 session 0x5647abd28000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace8b400 session 0x5647abd285a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaef1800 session 0x5647abd26d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.501795769s of 12.670096397s, submitted: 34
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353926 data_alloc: 218103808 data_used: 5152768
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647abd05800 session 0x5647abd27e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1ba626000/0x0/0x1bfc00000, data 0x59a249/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126803968 unmapped: 50569216 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace9d000 session 0x5647ade710e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ad02dc00 session 0x5647abc203c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1351789 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ac187c00 session 0x5647ab821860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647aaef1800 session 0x5647abe47e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ace89400 session 0x5647ab3fc780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 ms_handle_reset con 0x5647ade9a000 session 0x5647ab3f7e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1ba62b000/0x0/0x1bfc00000, data 0x59a103/0x6a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126812160 unmapped: 50561024 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348955 data_alloc: 218103808 data_used: 5148672
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 heartbeat osd_stat(store_statfs(0x1ba62b000/0x0/0x1bfc00000, data 0x59a103/0x6a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.818020821s of 11.096217155s, submitted: 46
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 handle_osd_map epochs [186,187], i have 186, src has [1,187]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 186 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126820352 unmapped: 50552832 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 187 ms_handle_reset con 0x5647ad6fdc00 session 0x5647b0d49860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1351281 data_alloc: 218103808 data_used: 5160960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 187 heartbeat osd_stat(store_statfs(0x1ba628000/0x0/0x1bfc00000, data 0x59bd8d/0x6a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 187 handle_osd_map epochs [187,188], i have 187, src has [1,188]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126828544 unmapped: 50544640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126836736 unmapped: 50536448 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 heartbeat osd_stat(store_statfs(0x1ba625000/0x0/0x1bfc00000, data 0x59d8cc/0x6a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355279 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126844928 unmapped: 50528256 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 188 handle_osd_map epochs [188,189], i have 188, src has [1,189]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 57.586193085s of 57.799587250s, submitted: 55
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 ms_handle_reset con 0x5647af974000 session 0x5647ab81ef00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 heartbeat osd_stat(store_statfs(0x1ba621000/0x0/0x1bfc00000, data 0x59f548/0x6ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360069 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 ms_handle_reset con 0x5647ace86000 session 0x5647ab81e780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 heartbeat osd_stat(store_statfs(0x1ba621000/0x0/0x1bfc00000, data 0x59f548/0x6ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 ms_handle_reset con 0x5647ad170800 session 0x5647abd28d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 ms_handle_reset con 0x5647ade9b400 session 0x5647b059d680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360069 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 heartbeat osd_stat(store_statfs(0x1ba621000/0x0/0x1bfc00000, data 0x59f548/0x6ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126853120 unmapped: 50520064 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 189 handle_osd_map epochs [189,190], i have 189, src has [1,190]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 50495488 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 50495488 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 190 ms_handle_reset con 0x5647ace89400 session 0x5647adae5a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 50495488 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 190 heartbeat osd_stat(store_statfs(0x1ba61f000/0x0/0x1bfc00000, data 0x5a11d2/0x6ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 50495488 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1361227 data_alloc: 218103808 data_used: 5169152
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.303149223s of 12.541453362s, submitted: 14
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 190 handle_osd_map epochs [190,191], i have 190, src has [1,191]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 50495488 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 191 ms_handle_reset con 0x5647b1bcd000 session 0x5647abd283c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126877696 unmapped: 50495488 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 191 heartbeat osd_stat(store_statfs(0x1ba1ab000/0x0/0x1bfc00000, data 0xa12e6a/0xb22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 50479104 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 191 handle_osd_map epochs [191,192], i have 191, src has [1,192]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400111 data_alloc: 218103808 data_used: 5177344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 192 heartbeat osd_stat(store_statfs(0x1ba1a8000/0x0/0x1bfc00000, data 0xa149a9/0xb25000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126902272 unmapped: 50470912 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400111 data_alloc: 218103808 data_used: 5177344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 192 handle_osd_map epochs [192,193], i have 192, src has [1,193]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.958933830s of 10.020713806s, submitted: 18
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 193 ms_handle_reset con 0x5647af1be800 session 0x5647ade714a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 193 heartbeat osd_stat(store_statfs(0x1ba616000/0x0/0x1bfc00000, data 0x5a6633/0x6b7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372507 data_alloc: 218103808 data_used: 5177344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 193 heartbeat osd_stat(store_statfs(0x1ba616000/0x0/0x1bfc00000, data 0x5a6633/0x6b7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 50454528 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 50446336 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 50446336 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375481 data_alloc: 218103808 data_used: 5177344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 50446336 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 50446336 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126926848 unmapped: 50446336 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126943232 unmapped: 50429952 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 50421760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375641 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ad734c00 session 0x5647ac2221e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ad735c00 session 0x5647adc9a000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126959616 unmapped: 50413568 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 52.690959930s of 52.743789673s, submitted: 21
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ade9b800 session 0x5647ac2234a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba613000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647af315000 session 0x5647b08a30e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 50405376 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 50405376 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1376258 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126967808 unmapped: 50405376 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ad029800 session 0x5647abc21a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 50388992 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba614000/0x0/0x1bfc00000, data 0x5a8172/0x6ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [0,1,0,2])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647a9aa7800 session 0x5647ac13ed20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ae86b800 session 0x5647b0d481e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647abd05800 session 0x5647ac171680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647af314800 session 0x5647ade5de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1401942 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1401942 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1401942 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1401942 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1401942 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1401942 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647af974800 session 0x5647ad17b0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ade9ac00 session 0x5647b08a2f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba3a1000/0x0/0x1bfc00000, data 0x81b172/0x92d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127746048 unmapped: 49627136 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 35.988471985s of 36.506675720s, submitted: 91
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647af975800 session 0x5647ab30b860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127688704 unmapped: 49684480 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ab389000 session 0x5647b0d49c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1403927 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127696896 unmapped: 49676288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba39f000/0x0/0x1bfc00000, data 0x81b182/0x92e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127696896 unmapped: 49676288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647aee36400 session 0x5647ad178780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127696896 unmapped: 49676288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ade9ac00 session 0x5647adf3af00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ab389000 session 0x5647ac13f680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127696896 unmapped: 49676288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127696896 unmapped: 49676288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1403159 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ae868400 session 0x5647adf3ba40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647aee36400 session 0x5647ac1710e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 127852544 unmapped: 49520640 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ad6fc800 session 0x5647adde4f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ab389000 session 0x5647ab316f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ad6fc800 session 0x5647ad17ba40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1ba134000/0x0/0x1bfc00000, data 0xa861e4/0xb9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 48840704 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 48840704 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ade9ac00 session 0x5647adda72c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ae868400 session 0x5647ab37d680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133791744 unmapped: 43581440 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647aee36400 session 0x5647aaf0cb40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ab389000 session 0x5647ac0e65a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 47513600 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1498148 data_alloc: 218103808 data_used: 5181440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 47513600 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 heartbeat osd_stat(store_statfs(0x1b993e000/0x0/0x1bfc00000, data 0x127d182/0x1390000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 47513600 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 47513600 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.460657120s of 13.904512405s, submitted: 117
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 129867776 unmapped: 47505408 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 ms_handle_reset con 0x5647ad02ac00 session 0x5647abd26780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 195 ms_handle_reset con 0x5647aaf06000 session 0x5647aaf0de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 47349760 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1477672 data_alloc: 218103808 data_used: 5189632
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 196 ms_handle_reset con 0x5647ad32e800 session 0x5647ac0e7e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 130039808 unmapped: 47333376 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 197 ms_handle_reset con 0x5647ac185400 session 0x5647b08a2780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 197 heartbeat osd_stat(store_statfs(0x1b9ba1000/0x0/0x1bfc00000, data 0x1015a88/0x112b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131538944 unmapped: 45834240 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647ac186800 session 0x5647aaf21680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647ad3ea000 session 0x5647aaf2b860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131596288 unmapped: 45776896 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647abdf7400 session 0x5647ab821a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131727360 unmapped: 45645824 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647ab389000 session 0x5647ac13fe00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647aaf06000 session 0x5647adf3a000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131751936 unmapped: 45621248 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514780 data_alloc: 218103808 data_used: 5189632
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647aee7b400 session 0x5647ac0e85a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131776512 unmapped: 45596672 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647abdf4000 session 0x5647ab3f74a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 ms_handle_reset con 0x5647ac185000 session 0x5647aaf0cf00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 199 ms_handle_reset con 0x5647abdf5c00 session 0x5647ade5c960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 199 ms_handle_reset con 0x5647ad02ac00 session 0x5647b0d49680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131547136 unmapped: 45826048 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 199 ms_handle_reset con 0x5647ad02d800 session 0x5647ab3f61e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 199 heartbeat osd_stat(store_statfs(0x1b9169000/0x0/0x1bfc00000, data 0x1a4afe7/0x1b64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131547136 unmapped: 45826048 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131555328 unmapped: 45817856 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131555328 unmapped: 45817856 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 200 heartbeat osd_stat(store_statfs(0x1b9166000/0x0/0x1bfc00000, data 0x1a4cb26/0x1b67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1579772 data_alloc: 218103808 data_used: 5206016
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131629056 unmapped: 45744128 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131629056 unmapped: 45744128 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 200 ms_handle_reset con 0x5647ade9bc00 session 0x5647ab37cb40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131629056 unmapped: 45744128 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131629056 unmapped: 45744128 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131629056 unmapped: 45744128 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1579772 data_alloc: 218103808 data_used: 5206016
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 200 heartbeat osd_stat(store_statfs(0x1b9166000/0x0/0x1bfc00000, data 0x1a4cb26/0x1b67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.316413879s of 17.166660309s, submitted: 199
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 201 ms_handle_reset con 0x5647ab38bc00 session 0x5647ade1ba40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131637248 unmapped: 45735936 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131653632 unmapped: 45719552 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 201 ms_handle_reset con 0x5647b1bcc800 session 0x5647abe47e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647ae86b800 session 0x5647b059cd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647ad3eb800 session 0x5647ac158b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647ad170400 session 0x5647ab821c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131735552 unmapped: 45637632 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647b0f0e400 session 0x5647ab821860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647ade9bc00 session 0x5647ac07e780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647ab38bc00 session 0x5647ade5dc20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647ad02d800 session 0x5647aaf20780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 ms_handle_reset con 0x5647b1bcc800 session 0x5647abe46d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1430440 data_alloc: 218103808 data_used: 5210112
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 heartbeat osd_stat(store_statfs(0x1ba5fb000/0x0/0x1bfc00000, data 0x5b65c2/0x6d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5fb000/0x0/0x1bfc00000, data 0x5b65c2/0x6d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433414 data_alloc: 218103808 data_used: 5210112
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131768320 unmapped: 45604864 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433574 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433574 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433574 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131792896 unmapped: 45580288 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.363340378s of 26.892532349s, submitted: 107
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d000 session 0x5647ab2ccd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131817472 unmapped: 45555712 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131817472 unmapped: 45555712 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af975400 session 0x5647ac159c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131817472 unmapped: 45555712 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1437602 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131825664 unmapped: 45547520 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647b1bcc400 session 0x5647ac158000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131833856 unmapped: 45539328 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131833856 unmapped: 45539328 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f6000/0x0/0x1bfc00000, data 0x5b81bb/0x6d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d000 session 0x5647b059c1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad02d800 session 0x5647abd26780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 131833856 unmapped: 45539328 heap: 177373184 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af975400 session 0x5647abd27a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647b1bcc800 session 0x5647abd27680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147144704 unmapped: 33906688 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1585474 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad3eb800 session 0x5647ab3174a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace89c00 session 0x5647ab81f680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d000 session 0x5647ade5da40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad02d800 session 0x5647abd261e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 132538368 unmapped: 48513024 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af975400 session 0x5647aaf201e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647b1bcc800 session 0x5647adde5e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace89c00 session 0x5647aaef3e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d000 session 0x5647adde52c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af975400 session 0x5647adf3ab40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ab38a400 session 0x5647ab0f61e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad3eb800 session 0x5647ab81f0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.831309319s of 10.063385010s, submitted: 28
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 132546560 unmapped: 48504832 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 142548992 unmapped: 38502400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ab38a400 session 0x5647ab81fe00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad02d800 session 0x5647ac0e6960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace89c00 session 0x5647adde50e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af975400 session 0x5647adf3af00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647a9aa6c00 session 0x5647aaef34a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133013504 unmapped: 48037888 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b8e01000/0x0/0x1bfc00000, data 0x1dab22d/0x1ecd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,2])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ac186400 session 0x5647adf3a780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647adcf4000 session 0x5647adc9a3c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af1be000 session 0x5647aaf0cb40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133013504 unmapped: 48037888 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1634572 data_alloc: 218103808 data_used: 5222400
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d000 session 0x5647ab37da40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad6fcc00 session 0x5647aaf0cf00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ac186400 session 0x5647ab3f74a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133316608 unmapped: 47734784 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647adcf4000 session 0x5647aaf20d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133316608 unmapped: 47734784 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133316608 unmapped: 47734784 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133316608 unmapped: 47734784 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647af1be000 session 0x5647aaf0d0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647abdf5400 session 0x5647abd294a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 133439488 unmapped: 47611904 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1688770 data_alloc: 234881024 data_used: 12181504
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b8ddf000/0x0/0x1bfc00000, data 0x1dcf1bb/0x1eef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [0,0,0,0,0,1])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 139083776 unmapped: 41967616 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 145850368 unmapped: 35201024 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 145858560 unmapped: 35192832 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 145858560 unmapped: 35192832 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 145866752 unmapped: 35184640 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1816102 data_alloc: 251658240 data_used: 28340224
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b8ddf000/0x0/0x1bfc00000, data 0x1dcf1bb/0x1eef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.501823425s of 13.976967812s, submitted: 81
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647aee7a400 session 0x5647ac171860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d800 session 0x5647adde5a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 145866752 unmapped: 35184640 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647abdf5400 session 0x5647adde4f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143499264 unmapped: 37552128 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143499264 unmapped: 37552128 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ab38a800 session 0x5647ade5dc20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143499264 unmapped: 37552128 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b97a6000/0x0/0x1bfc00000, data 0x140a149/0x1528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143499264 unmapped: 37552128 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666803 data_alloc: 234881024 data_used: 18268160
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ace9d000 session 0x5647b059de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad6fcc00 session 0x5647ade1ad20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143499264 unmapped: 37552128 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143515648 unmapped: 37535744 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b97a7000/0x0/0x1bfc00000, data 0x140a139/0x1527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ab38a800 session 0x5647ade5cd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b97a7000/0x0/0x1bfc00000, data 0x140a139/0x1527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134938624 unmapped: 46112768 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134938624 unmapped: 46112768 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134938624 unmapped: 46112768 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1458177 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134938624 unmapped: 46112768 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f9000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.420623779s of 11.048128128s, submitted: 80
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647aaef0000 session 0x5647b0d49860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 46104576 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ac186800 session 0x5647adae5a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f9000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f9000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f9000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457585 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ab38a400 session 0x5647ab37cd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647aaf96400 session 0x5647aaef25a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647aaef0000 session 0x5647ade1a960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8149/0x6d6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 134963200 unmapped: 46088192 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1459413 data_alloc: 218103808 data_used: 5214208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ae869800 session 0x5647ade1ab40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647add0e000 session 0x5647ade1b2c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135929856 unmapped: 45121536 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647abc10400 session 0x5647ac0e81e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 ms_handle_reset con 0x5647ad02c800 session 0x5647ab0f6000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 heartbeat osd_stat(store_statfs(0x1ba5f8000/0x0/0x1bfc00000, data 0x5b8139/0x6d5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135929856 unmapped: 45121536 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135929856 unmapped: 45121536 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 203 handle_osd_map epochs [203,204], i have 203, src has [1,204]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.965347290s of 12.074249268s, submitted: 29
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135938048 unmapped: 45113344 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647aaef0000 session 0x5647ab0f72c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1ba1db000/0x0/0x1bfc00000, data 0x9d5149/0xaf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1498717 data_alloc: 218103808 data_used: 5226496
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1ba1d7000/0x0/0x1bfc00000, data 0x9d6da2/0xaf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1498717 data_alloc: 218103808 data_used: 5226496
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1ba1d7000/0x0/0x1bfc00000, data 0x9d6da2/0xaf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1498717 data_alloc: 218103808 data_used: 5226496
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1ba1d7000/0x0/0x1bfc00000, data 0x9d6da2/0xaf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1498717 data_alloc: 218103808 data_used: 5226496
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1ba1d7000/0x0/0x1bfc00000, data 0x9d6da2/0xaf6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 135946240 unmapped: 45105152 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.435506821s of 17.446155548s, submitted: 3
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647b1bcd800 session 0x5647ab37c000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9ef3000/0x0/0x1bfc00000, data 0xcbbda2/0xddb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ae86ac00 session 0x5647aaf0d860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ace89000 session 0x5647ab0f7c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647aaf97c00 session 0x5647ab81ef00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647aaef0000 session 0x5647ac13e1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1539831 data_alloc: 218103808 data_used: 5226496
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ade9bc00 session 0x5647ab8201e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9ce9000/0x0/0x1bfc00000, data 0xec5da2/0xfe5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9ce9000/0x0/0x1bfc00000, data 0xec5da2/0xfe5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ac0e5400 session 0x5647abcb03c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1539831 data_alloc: 218103808 data_used: 5226496
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647b224fc00 session 0x5647ac158960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ac187000 session 0x5647ad17b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647aaef0000 session 0x5647abe474a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136093696 unmapped: 44957696 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ac0e5400 session 0x5647ab3f6d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ade9bc00 session 0x5647ab37c000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.304218292s of 10.361325264s, submitted: 10
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647b224fc00 session 0x5647ab0f72c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136249344 unmapped: 44802048 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136200192 unmapped: 44851200 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1610505 data_alloc: 234881024 data_used: 14344192
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9cc4000/0x0/0x1bfc00000, data 0xee9db2/0x100a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9cc4000/0x0/0x1bfc00000, data 0xee9db2/0x100a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ac185000 session 0x5647b0d49860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9cc4000/0x0/0x1bfc00000, data 0xee9db2/0x100a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ad02a400 session 0x5647adc9a960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ad02b800 session 0x5647abe46d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647aaef0000 session 0x5647ade5cd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ac0e5400 session 0x5647ab30b2c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606511 data_alloc: 234881024 data_used: 14340096
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9cc5000/0x0/0x1bfc00000, data 0xec5da2/0xfe5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 136208384 unmapped: 44843008 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b9ce9000/0x0/0x1bfc00000, data 0xec5da2/0xfe5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x4f2f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.494952202s of 11.794830322s, submitted: 6
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147390464 unmapped: 33660928 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 142991360 unmapped: 38060032 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1688149 data_alloc: 234881024 data_used: 14340096
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647b0f0ec00 session 0x5647ade70000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 144613376 unmapped: 36438016 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647aaef0000 session 0x5647aaef2000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 142917632 unmapped: 38133760 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b8d4e000/0x0/0x1bfc00000, data 0x1a50da2/0x1b70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x533f9c6), peers [0,1] op hist [0,0,0,0,0,2])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143015936 unmapped: 38035456 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143015936 unmapped: 38035456 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 ms_handle_reset con 0x5647ac0e5400 session 0x5647abd26000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143015936 unmapped: 38035456 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1696845 data_alloc: 234881024 data_used: 14372864
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 ms_handle_reset con 0x5647ac0e5c00 session 0x5647adf3bc20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143015936 unmapped: 38035456 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 143015936 unmapped: 38035456 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 heartbeat osd_stat(store_statfs(0x1b8d35000/0x0/0x1bfc00000, data 0x1a67a4f/0x1b88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x533f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 ms_handle_reset con 0x5647ad3ea400 session 0x5647ac158d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 ms_handle_reset con 0x5647af3d7000 session 0x5647adf3a1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1635737 data_alloc: 234881024 data_used: 10350592
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 ms_handle_reset con 0x5647ade9b400 session 0x5647ac07fc20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 ms_handle_reset con 0x5647af975800 session 0x5647adde4f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 heartbeat osd_stat(store_statfs(0x1b9154000/0x0/0x1bfc00000, data 0x164aa3f/0x176a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x533f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 205 handle_osd_map epochs [205,206], i have 205, src has [1,206]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.009008408s of 14.204324722s, submitted: 87
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac186c00 session 0x5647aaf201e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b914e000/0x0/0x1bfc00000, data 0x164c5f0/0x176f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x533f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1643841 data_alloc: 234881024 data_used: 10366976
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ad02a000 session 0x5647b08a34a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140533760 unmapped: 40517632 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac0e4000 session 0x5647adde5e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 141721600 unmapped: 39329792 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b7c12000/0x0/0x1bfc00000, data 0x19ea58e/0x1b0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac0e4000 session 0x5647aa7154a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b7c12000/0x0/0x1bfc00000, data 0x19ea58e/0x1b0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 141729792 unmapped: 39321600 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac186c00 session 0x5647adf3a780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 141729792 unmapped: 39321600 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b7c12000/0x0/0x1bfc00000, data 0x19ea58e/0x1b0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 141729792 unmapped: 39321600 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1672130 data_alloc: 234881024 data_used: 10371072
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ade9a000 session 0x5647ade1b2c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 141729792 unmapped: 39321600 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 43696128 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.767542839s of 10.121741295s, submitted: 37
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647b0f0f400 session 0x5647abd26b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514922 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514922 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514922 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514922 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514922 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1514922 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 42647552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647aaef5000 session 0x5647aaf0de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac0e4000 session 0x5647adc9b4a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac186c00 session 0x5647adf3ab40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ade9a000 session 0x5647adf3b2c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.454231262s of 31.457796097s, submitted: 1
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8ca1000/0x0/0x1bfc00000, data 0x95b58e/0xa7d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 139534336 unmapped: 41517056 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647b0f0e800 session 0x5647ac0e9a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647b0f0f400 session 0x5647adf3ad20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac0e4000 session 0x5647ab37cb40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac186c00 session 0x5647ade701e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140230656 unmapped: 40820736 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1598865 data_alloc: 218103808 data_used: 5242880
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0x126b59e/0x138e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ade9a000 session 0x5647abd294a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647b0f0e800 session 0x5647b0d48780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647b0f0f400 session 0x5647b0d483c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac0e4000 session 0x5647ade5d4a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140247040 unmapped: 40804352 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ac186c00 session 0x5647ab3fd0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140615680 unmapped: 40435712 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140623872 unmapped: 40427520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8364000/0x0/0x1bfc00000, data 0x12955d1/0x13ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1632235 data_alloc: 218103808 data_used: 8957952
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1632235 data_alloc: 218103808 data_used: 8957952
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8364000/0x0/0x1bfc00000, data 0x12955d1/0x13ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8364000/0x0/0x1bfc00000, data 0x12955d1/0x13ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.942320824s of 13.133686066s, submitted: 48
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 40419328 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ae869800 session 0x5647adae5c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140541952 unmapped: 40509440 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8340000/0x0/0x1bfc00000, data 0x12b95d1/0x13de000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140550144 unmapped: 40501248 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 140558336 unmapped: 40493056 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1665648 data_alloc: 234881024 data_used: 12873728
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b8340000/0x0/0x1bfc00000, data 0x12b95d1/0x13de000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147619840 unmapped: 33431552 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147636224 unmapped: 33415168 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b78e6000/0x0/0x1bfc00000, data 0x1d0b5d1/0x1e30000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147636224 unmapped: 33415168 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147636224 unmapped: 33415168 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147668992 unmapped: 33382400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1785668 data_alloc: 234881024 data_used: 17240064
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147668992 unmapped: 33382400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147668992 unmapped: 33382400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.859712601s of 11.091236115s, submitted: 72
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147668992 unmapped: 33382400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b78dc000/0x0/0x1bfc00000, data 0x1d155d1/0x1e3a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b78dc000/0x0/0x1bfc00000, data 0x1d155d1/0x1e3a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147668992 unmapped: 33382400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147668992 unmapped: 33382400 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1785684 data_alloc: 234881024 data_used: 17240064
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152600576 unmapped: 28450816 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b6b6c000/0x0/0x1bfc00000, data 0x2a8d5d1/0x2bb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152174592 unmapped: 28876800 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151887872 unmapped: 29163520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69f0000/0x0/0x1bfc00000, data 0x2c085d1/0x2d2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151887872 unmapped: 29163520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151887872 unmapped: 29163520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1911972 data_alloc: 234881024 data_used: 18878464
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151887872 unmapped: 29163520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151887872 unmapped: 29163520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151887872 unmapped: 29163520 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.695406914s of 10.552433968s, submitted: 132
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69f0000/0x0/0x1bfc00000, data 0x2c085d1/0x2d2d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153993216 unmapped: 27058176 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153993216 unmapped: 27058176 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1908968 data_alloc: 234881024 data_used: 18878464
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69cd000/0x0/0x1bfc00000, data 0x2c2c5d1/0x2d51000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153993216 unmapped: 27058176 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69cd000/0x0/0x1bfc00000, data 0x2c2c5d1/0x2d51000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1910916 data_alloc: 234881024 data_used: 18890752
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69ab000/0x0/0x1bfc00000, data 0x2c4e5d1/0x2d73000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69ab000/0x0/0x1bfc00000, data 0x2c4e5d1/0x2d73000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ab38b000 session 0x5647abd263c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647ae868800 session 0x5647aaf214a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152944640 unmapped: 28106752 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 ms_handle_reset con 0x5647af1bf000 session 0x5647ab30bc20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.563100815s of 11.475221634s, submitted: 23
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152969216 unmapped: 28082176 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1910156 data_alloc: 234881024 data_used: 18898944
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b69ab000/0x0/0x1bfc00000, data 0x2c4e5d1/0x2d73000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 207 ms_handle_reset con 0x5647ab285800 session 0x5647ac07e780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 207 ms_handle_reset con 0x5647abdf5400 session 0x5647b0d48f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152985600 unmapped: 28065792 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 207 ms_handle_reset con 0x5647ac186400 session 0x5647aaef2000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 207 ms_handle_reset con 0x5647abd04800 session 0x5647abc21a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166854656 unmapped: 14196736 heap: 181051392 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 208 ms_handle_reset con 0x5647ab285800 session 0x5647ac0e8000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 208 ms_handle_reset con 0x5647af1be400 session 0x5647ab8201e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169902080 unmapped: 28467200 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 208 handle_osd_map epochs [208,209], i have 208, src has [1,209]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 208 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 209 ms_handle_reset con 0x5647abdf5400 session 0x5647adf3b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169910272 unmapped: 28459008 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 209 heartbeat osd_stat(store_statfs(0x1b572d000/0x0/0x1bfc00000, data 0x3ec6ccd/0x3fef000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169943040 unmapped: 28426240 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2131957 data_alloc: 251658240 data_used: 34746368
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 ms_handle_reset con 0x5647ac186400 session 0x5647ac0e9680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169943040 unmapped: 28426240 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 ms_handle_reset con 0x5647af974400 session 0x5647aaf0da40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169943040 unmapped: 28426240 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 ms_handle_reset con 0x5647ab285800 session 0x5647aaef3860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 ms_handle_reset con 0x5647abdf5400 session 0x5647ab0f72c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 15K writes, 56K keys, 15K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 15K writes, 5008 syncs, 3.07 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2939 writes, 9027 keys, 2939 commit groups, 1.0 writes per commit group, ingest: 7.53 MB, 0.01 MB/s#012Interval WAL: 2939 writes, 1197 syncs, 2.46 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169943040 unmapped: 28426240 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169943040 unmapped: 28426240 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 heartbeat osd_stat(store_statfs(0x1b5729000/0x0/0x1bfc00000, data 0x3ec8942/0x3ff2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169975808 unmapped: 28393472 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2131957 data_alloc: 251658240 data_used: 34746368
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169975808 unmapped: 28393472 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 heartbeat osd_stat(store_statfs(0x1b5729000/0x0/0x1bfc00000, data 0x3ec8942/0x3ff2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169975808 unmapped: 28393472 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.801080704s of 12.602146149s, submitted: 53
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2125763 data_alloc: 251658240 data_used: 34750464
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b5728000/0x0/0x1bfc00000, data 0x3eca481/0x3ff5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2125763 data_alloc: 251658240 data_used: 34750464
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b5728000/0x0/0x1bfc00000, data 0x3eca481/0x3ff5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b5728000/0x0/0x1bfc00000, data 0x3eca481/0x3ff5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167763968 unmapped: 30605312 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.165513039s of 12.176384926s, submitted: 15
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167772160 unmapped: 30597120 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2125987 data_alloc: 251658240 data_used: 34750464
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aee37c00 session 0x5647aa7154a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f73000/0x0/0x1bfc00000, data 0x2681471/0x27ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f97000/0x0/0x1bfc00000, data 0x265d471/0x2787000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819795 data_alloc: 218103808 data_used: 9170944
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f97000/0x0/0x1bfc00000, data 0x265d471/0x2787000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819795 data_alloc: 218103808 data_used: 9170944
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f97000/0x0/0x1bfc00000, data 0x265d471/0x2787000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f97000/0x0/0x1bfc00000, data 0x265d471/0x2787000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aaef0000 session 0x5647ab2cd860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aa750800 session 0x5647ac0e7860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aa750800 session 0x5647abd29e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f97000/0x0/0x1bfc00000, data 0x265d471/0x2787000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aaef0000 session 0x5647abc20000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.107267380s of 13.952614784s, submitted: 45
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aee37c00 session 0x5647abcb05a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1820689 data_alloc: 218103808 data_used: 9170944
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6f97000/0x0/0x1bfc00000, data 0x265d471/0x2787000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647abdf5400 session 0x5647adde45a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647ab285800 session 0x5647abd285a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aa750800 session 0x5647ac0e7860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647b18a3c00 session 0x5647ab2cd860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647aaef0000 session 0x5647ab0f72c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647abdf5400 session 0x5647aaef3860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149053440 unmapped: 49315840 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 169705472 unmapped: 28663808 heap: 198369280 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b6796000/0x0/0x1bfc00000, data 0x2e5c4e3/0x2f88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [0,0,0,0,0,1,0,0,0,6,3])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 handle_osd_map epochs [211,212], i have 211, src has [1,212]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 ms_handle_reset con 0x5647adcf5000 session 0x5647ac0e7680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 212 ms_handle_reset con 0x5647aee37c00 session 0x5647ac0e9680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166027264 unmapped: 40304640 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166027264 unmapped: 40304640 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2029875 data_alloc: 251658240 data_used: 28508160
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166109184 unmapped: 40222720 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 213 ms_handle_reset con 0x5647aa750800 session 0x5647b0d49c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 213 ms_handle_reset con 0x5647adcf4800 session 0x5647abd294a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166232064 unmapped: 40099840 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 213 ms_handle_reset con 0x5647ade9a000 session 0x5647ac0e8f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 213 ms_handle_reset con 0x5647b0f0e800 session 0x5647b059de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 213 ms_handle_reset con 0x5647b0f0e800 session 0x5647abd28780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 161193984 unmapped: 45137920 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 213 handle_osd_map epochs [213,214], i have 213, src has [1,214]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 161202176 unmapped: 45129728 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 214 heartbeat osd_stat(store_statfs(0x1b5e77000/0x0/0x1bfc00000, data 0x374edfe/0x387c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 4.818826199s of 10.051407814s, submitted: 344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 214 ms_handle_reset con 0x5647aaef0000 session 0x5647aaf0d0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156385280 unmapped: 49946624 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 214 ms_handle_reset con 0x5647ad02a800 session 0x5647ac0e92c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1754369 data_alloc: 234881024 data_used: 13631488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152805376 unmapped: 53526528 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 214 ms_handle_reset con 0x5647ad028000 session 0x5647ac0e8b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152805376 unmapped: 53526528 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 214 heartbeat osd_stat(store_statfs(0x1b7f09000/0x0/0x1bfc00000, data 0x16e7ab7/0x1815000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152805376 unmapped: 53526528 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152322048 unmapped: 54009856 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 215 heartbeat osd_stat(store_statfs(0x1b7f05000/0x0/0x1bfc00000, data 0x16e964a/0x1818000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152338432 unmapped: 53993472 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1758543 data_alloc: 234881024 data_used: 13639680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152338432 unmapped: 53993472 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152338432 unmapped: 53993472 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152338432 unmapped: 53993472 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7f05000/0x0/0x1bfc00000, data 0x16e964a/0x1818000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152338432 unmapped: 53993472 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.499293327s of 10.553395271s, submitted: 49
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771797 data_alloc: 234881024 data_used: 14606336
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 53813248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7f03000/0x0/0x1bfc00000, data 0x16eb189/0x181b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac186400 session 0x5647abc20b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaef0000 session 0x5647abcb10e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7f03000/0x0/0x1bfc00000, data 0x16eb189/0x181b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1775797 data_alloc: 234881024 data_used: 15081472
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647adcf5400 session 0x5647abcb1680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7efd000/0x0/0x1bfc00000, data 0x16f01eb/0x1821000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152690688 unmapped: 53641216 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3ea400 session 0x5647abd28b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.772855759s of 10.019997597s, submitted: 14
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152707072 unmapped: 53624832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12b000 session 0x5647ac171860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7efd000/0x0/0x1bfc00000, data 0x16f01eb/0x1821000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,12])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647abdf6000 session 0x5647ab820d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1812185 data_alloc: 234881024 data_used: 15081472
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153321472 unmapped: 53010432 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaef0000 session 0x5647adae4960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7c12000/0x0/0x1bfc00000, data 0x19db1b2/0x1b0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153321472 unmapped: 53010432 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7c12000/0x0/0x1bfc00000, data 0x19db1eb/0x1b0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7c12000/0x0/0x1bfc00000, data 0x19db1eb/0x1b0c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153321472 unmapped: 53010432 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153321472 unmapped: 53010432 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12b000 session 0x5647ad17b860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7c11000/0x0/0x1bfc00000, data 0x19dc1eb/0x1b0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153501696 unmapped: 52830208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1853951 data_alloc: 234881024 data_used: 15073280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7736000/0x0/0x1bfc00000, data 0x1eb31eb/0x1fe4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647abdf5400 session 0x5647ab81ed20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647b1bccc00 session 0x5647b0d492c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647b0f0e000 session 0x5647b0d48000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.902896881s of 10.950631142s, submitted: 51
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1852527 data_alloc: 234881024 data_used: 15077376
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaef0000 session 0x5647b0d485a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647abdf5400 session 0x5647abd274a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12b000 session 0x5647ade1a960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153550848 unmapped: 52781056 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b79a2000/0x0/0x1bfc00000, data 0x1bc8199/0x1cf9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153722880 unmapped: 52609024 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156385280 unmapped: 49946624 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156385280 unmapped: 49946624 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ace8b000 session 0x5647adde5680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1657653 data_alloc: 234881024 data_used: 10334208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ae868000 session 0x5647adf3ab40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8b4d000/0x0/0x1bfc00000, data 0xaa1104/0xbcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1656893 data_alloc: 234881024 data_used: 10334208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8b4d000/0x0/0x1bfc00000, data 0xaa1104/0xbcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8b4d000/0x0/0x1bfc00000, data 0xaa1104/0xbcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150798336 unmapped: 55533568 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8b4d000/0x0/0x1bfc00000, data 0xaa1104/0xbcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.686045647s of 13.968108177s, submitted: 48
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152584192 unmapped: 53747712 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1738629 data_alloc: 234881024 data_used: 10637312
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152199168 unmapped: 54132736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b806a000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b806a000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1746747 data_alloc: 234881024 data_used: 10633216
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b806a000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad02c400 session 0x5647ab0f7e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647adcf4c00 session 0x5647ab0f7680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.074517250s of 10.435754776s, submitted: 70
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647adbef800 session 0x5647b059c3c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1748221 data_alloc: 234881024 data_used: 10633216
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b806a000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaef0000 session 0x5647b059c780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaf96000 session 0x5647ab3f74a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152616960 unmapped: 53714944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaf96000 session 0x5647ade1b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aaef0000 session 0x5647abd29860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152616960 unmapped: 53714944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1787546 data_alloc: 234881024 data_used: 10633216
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152616960 unmapped: 53714944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b45000/0x0/0x1bfc00000, data 0x1aaa166/0x1bd9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152616960 unmapped: 53714944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152616960 unmapped: 53714944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b45000/0x0/0x1bfc00000, data 0x1aaa166/0x1bd9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152616960 unmapped: 53714944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b45000/0x0/0x1bfc00000, data 0x1aaa166/0x1bd9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.126835823s of 10.728670120s, submitted: 54
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647add0e000 session 0x5647ac1701e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1786510 data_alloc: 234881024 data_used: 10633216
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647b1bcc000 session 0x5647ab37c000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8077000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152674304 unmapped: 53657600 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152674304 unmapped: 53657600 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8077000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152674304 unmapped: 53657600 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8077000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152674304 unmapped: 53657600 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ab38b000 session 0x5647ab30bc20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647b1bccc00 session 0x5647ade1a000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1748002 data_alloc: 234881024 data_used: 10637312
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8077000/0x0/0x1bfc00000, data 0x1578104/0x16a6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1748002 data_alloc: 234881024 data_used: 10637312
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.844432831s of 11.043012619s, submitted: 28
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8075000/0x0/0x1bfc00000, data 0x1579104/0x16a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8075000/0x0/0x1bfc00000, data 0x1579104/0x16a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8075000/0x0/0x1bfc00000, data 0x1579104/0x16a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8075000/0x0/0x1bfc00000, data 0x1579104/0x16a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1748310 data_alloc: 234881024 data_used: 10637312
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1748310 data_alloc: 234881024 data_used: 10637312
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8075000/0x0/0x1bfc00000, data 0x1579104/0x16a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647abdf5800 session 0x5647ade1ad20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac186400 session 0x5647adde43c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152649728 unmapped: 53682176 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8075000/0x0/0x1bfc00000, data 0x1579104/0x16a7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.814155579s of 13.882983208s, submitted: 1
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1593345 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12c800 session 0x5647ade1ab40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147513344 unmapped: 58818560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9022000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147513344 unmapped: 58818560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147513344 unmapped: 58818560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147513344 unmapped: 58818560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3ea400 session 0x5647adc9be00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647abdf5800 session 0x5647adc9ab40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12c800 session 0x5647adc9af00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac186400 session 0x5647adc9b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147513344 unmapped: 58818560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3ea400 session 0x5647adc9ba40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1682630 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8f0d000/0x0/0x1bfc00000, data 0x6e311d/0x811000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647b1bccc00 session 0x5647ade710e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fb000/0x0/0x1bfc00000, data 0x10f5156/0x1223000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1681406 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fb000/0x0/0x1bfc00000, data 0x10f5156/0x1223000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fb000/0x0/0x1bfc00000, data 0x10f5156/0x1223000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1681406 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fb000/0x0/0x1bfc00000, data 0x10f5156/0x1223000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.949623108s of 15.872022629s, submitted: 45
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa750800 session 0x5647ade70960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fb000/0x0/0x1bfc00000, data 0x10f5156/0x1223000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 147628032 unmapped: 58703872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151404544 unmapped: 54927360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fa000/0x0/0x1bfc00000, data 0x10f5179/0x1224000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151404544 unmapped: 54927360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1758419 data_alloc: 234881024 data_used: 15843328
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151404544 unmapped: 54927360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151404544 unmapped: 54927360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fa000/0x0/0x1bfc00000, data 0x10f5179/0x1224000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151437312 unmapped: 54894592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fa000/0x0/0x1bfc00000, data 0x10f5179/0x1224000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151470080 unmapped: 54861824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151470080 unmapped: 54861824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1758419 data_alloc: 234881024 data_used: 15843328
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151470080 unmapped: 54861824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151470080 unmapped: 54861824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151470080 unmapped: 54861824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b84fa000/0x0/0x1bfc00000, data 0x10f5179/0x1224000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151470080 unmapped: 54861824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.049396515s of 13.066587448s, submitted: 5
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156958720 unmapped: 49373184 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1833119 data_alloc: 234881024 data_used: 16207872
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157040640 unmapped: 49291264 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b01000/0x0/0x1bfc00000, data 0x1aed179/0x1c1c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7af7000/0x0/0x1bfc00000, data 0x1af7179/0x1c26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839823 data_alloc: 234881024 data_used: 16207872
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7af7000/0x0/0x1bfc00000, data 0x1af7179/0x1c26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839823 data_alloc: 234881024 data_used: 16207872
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7af7000/0x0/0x1bfc00000, data 0x1af7179/0x1c26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839823 data_alloc: 234881024 data_used: 16207872
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7af7000/0x0/0x1bfc00000, data 0x1af7179/0x1c26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157057024 unmapped: 49274880 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839823 data_alloc: 234881024 data_used: 16207872
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7af7000/0x0/0x1bfc00000, data 0x1af7179/0x1c26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839823 data_alloc: 234881024 data_used: 16207872
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.364290237s of 27.561794281s, submitted: 81
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647af315c00 session 0x5647ab3fd2c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157065216 unmapped: 49266688 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7af7000/0x0/0x1bfc00000, data 0x1af7179/0x1c26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ade9a400 session 0x5647ab3fd680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606595 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606595 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606595 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150126592 unmapped: 56205312 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606595 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606595 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150134784 unmapped: 56197120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1606595 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b901b000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150142976 unmapped: 56188928 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aee7b800 session 0x5647aaef34a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ae869c00 session 0x5647ac1592c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa750800 session 0x5647ac07fa40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ade9a400 session 0x5647ac0e81e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.986610413s of 34.106418610s, submitted: 32
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150159360 unmapped: 56172544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150167552 unmapped: 56164352 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150175744 unmapped: 56156160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150175744 unmapped: 56156160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150175744 unmapped: 56156160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150175744 unmapped: 56156160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150175744 unmapped: 56156160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150183936 unmapped: 56147968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150183936 unmapped: 56147968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150183936 unmapped: 56147968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150183936 unmapped: 56147968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150183936 unmapped: 56147968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150200320 unmapped: 56131584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150200320 unmapped: 56131584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150200320 unmapped: 56131584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149880832 unmapped: 56451072 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc handle_mgr_map Got map version 12
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/2716354406,v1:192.168.122.100:6801/2716354406]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 149995520 unmapped: 56336384 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1609541 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150003712 unmapped: 56328192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.149471283s of 63.159767151s, submitted: 3
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf104/0x6fd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ae869c00 session 0x5647aaef25a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150183936 unmapped: 56147968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3eb400 session 0x5647ab37c960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aee37800 session 0x5647ad17b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa750800 session 0x5647adde41e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3eb400 session 0x5647aa7154a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ade9a400 session 0x5647ab3f6d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150192128 unmapped: 56139776 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ae869c00 session 0x5647b0d49a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1712776 data_alloc: 218103808 data_used: 5283840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150200320 unmapped: 56131584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150200320 unmapped: 56131584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac184000 session 0x5647ab3f6b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa750800 session 0x5647b059cf00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150519808 unmapped: 55812096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 150536192 unmapped: 55795712 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b839b000/0x0/0x1bfc00000, data 0x1253137/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153772032 unmapped: 52559872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1801939 data_alloc: 234881024 data_used: 16949248
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153772032 unmapped: 52559872 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc handle_mgr_map Got map version 13
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/2716354406,v1:192.168.122.100:6801/2716354406]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b839b000/0x0/0x1bfc00000, data 0x1253137/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b839b000/0x0/0x1bfc00000, data 0x1253137/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b839b000/0x0/0x1bfc00000, data 0x1253137/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b839b000/0x0/0x1bfc00000, data 0x1253137/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1801939 data_alloc: 234881024 data_used: 16949248
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153845760 unmapped: 52486144 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b839b000/0x0/0x1bfc00000, data 0x1253137/0x1383000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.049152374s of 16.241672516s, submitted: 40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153649152 unmapped: 52682752 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1891537 data_alloc: 234881024 data_used: 18673664
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158932992 unmapped: 47398912 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7a137/0x1caa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1898509 data_alloc: 234881024 data_used: 19038208
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7a137/0x1caa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1898845 data_alloc: 234881024 data_used: 19046400
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7a137/0x1caa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.867920876s of 15.106770515s, submitted: 99
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1896929 data_alloc: 234881024 data_used: 19058688
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3eb400 session 0x5647ab37c960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ade9a400 session 0x5647adae5c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158621696 unmapped: 47710208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158646272 unmapped: 47685632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897749 data_alloc: 234881024 data_used: 19165184
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1897749 data_alloc: 234881024 data_used: 19165184
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158654464 unmapped: 47677440 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.547518730s of 15.565766335s, submitted: 6
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1903777 data_alloc: 234881024 data_used: 19595264
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a71000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a71000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1905217 data_alloc: 234881024 data_used: 19738624
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 47669248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158679040 unmapped: 47652864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7a73000/0x0/0x1bfc00000, data 0x1b7b137/0x1cab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158679040 unmapped: 47652864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158679040 unmapped: 47652864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158679040 unmapped: 47652864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.290778160s of 10.332545280s, submitted: 10
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad3ea400 session 0x5647aaef2f00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad6fd800 session 0x5647ab821a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1904593 data_alloc: 234881024 data_used: 19746816
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158679040 unmapped: 47652864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647af3d7000 session 0x5647ab81fa40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8ffc000/0x0/0x1bfc00000, data 0x5f3127/0x722000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b9021000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1624276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152010752 unmapped: 54321152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.870122910s of 32.034774780s, submitted: 43
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647abdf5800 session 0x5647ac158d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1736248 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ab388400 session 0x5647b059cb40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa991c00 session 0x5647b059cd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ae869c00 session 0x5647b059c960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa991c00 session 0x5647b059cf00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1736248 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 151224320 unmapped: 55107584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1838008 data_alloc: 234881024 data_used: 19517440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1838008 data_alloc: 234881024 data_used: 19517440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8211000/0x0/0x1bfc00000, data 0x13e00f4/0x150d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x64df9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153632768 unmapped: 52699136 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.445199966s of 20.548646927s, submitted: 22
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163889152 unmapped: 42442752 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163069952 unmapped: 43261952 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1937660 data_alloc: 234881024 data_used: 21577728
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7347000/0x0/0x1bfc00000, data 0x1e9a0f4/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163069952 unmapped: 43261952 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163069952 unmapped: 43261952 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163069952 unmapped: 43261952 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7347000/0x0/0x1bfc00000, data 0x1e9a0f4/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163069952 unmapped: 43261952 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163069952 unmapped: 43261952 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1935472 data_alloc: 234881024 data_used: 21577728
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163094528 unmapped: 43237376 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7326000/0x0/0x1bfc00000, data 0x1ebb0f4/0x1fe8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163094528 unmapped: 43237376 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163094528 unmapped: 43237376 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7317000/0x0/0x1bfc00000, data 0x1eca0f4/0x1ff7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163094528 unmapped: 43237376 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ab388400 session 0x5647aaf0de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163094528 unmapped: 43237376 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.336358070s of 12.641411781s, submitted: 127
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7317000/0x0/0x1bfc00000, data 0x1eca0f4/0x1ff7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1636276 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647af1bf400 session 0x5647ade71c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153862144 unmapped: 52469760 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153862144 unmapped: 52469760 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647af314800 session 0x5647adae4000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aee37000 session 0x5647adf3b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153862144 unmapped: 52469760 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa991c00 session 0x5647ade1b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154042368 unmapped: 52289536 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12b000 session 0x5647ac159c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa990000 session 0x5647ab3fc780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b83cd000/0x0/0x1bfc00000, data 0xe140f4/0xf41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154042368 unmapped: 52289536 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ad6fd000 session 0x5647adde4b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa750800 session 0x5647adde5a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b83cd000/0x0/0x1bfc00000, data 0xe140f4/0xf41000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1710397 data_alloc: 218103808 data_used: 5279744
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154230784 unmapped: 52101120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154230784 unmapped: 52101120 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b83a8000/0x0/0x1bfc00000, data 0xe38117/0xf66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771837 data_alloc: 234881024 data_used: 11943936
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b83a8000/0x0/0x1bfc00000, data 0xe38117/0xf66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771837 data_alloc: 234881024 data_used: 11943936
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155082752 unmapped: 51249152 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.311281204s of 17.433422089s, submitted: 35
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158244864 unmapped: 48087040 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7ba4000/0x0/0x1bfc00000, data 0x162e117/0x175c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b8a000/0x0/0x1bfc00000, data 0x164e117/0x177c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1851657 data_alloc: 234881024 data_used: 13291520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b8a000/0x0/0x1bfc00000, data 0x164e117/0x177c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7b8a000/0x0/0x1bfc00000, data 0x164e117/0x177c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1851673 data_alloc: 234881024 data_used: 13291520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa991c00 session 0x5647aaf214a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa990000 session 0x5647abd26d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157564928 unmapped: 48766976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.054994583s of 10.294936180s, submitted: 103
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ab389800 session 0x5647b059d680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649073 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b8c11000/0x0/0x1bfc00000, data 0x5cf0f4/0x6fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152633344 unmapped: 53698560 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.853782654s of 35.227573395s, submitted: 23
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647af3d6800 session 0x5647ab81f0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7e54000/0x0/0x1bfc00000, data 0x138d0f4/0x14ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1758067 data_alloc: 218103808 data_used: 3444736
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ae869400 session 0x5647abcb01e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7e54000/0x0/0x1bfc00000, data 0x138d0f4/0x14ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aa990000 session 0x5647ade71860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aee36000 session 0x5647abd270e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ade9a000 session 0x5647ab3fde00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155484160 unmapped: 50847744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155492352 unmapped: 50839552 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819799 data_alloc: 234881024 data_used: 11005952
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154828800 unmapped: 51503104 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7e2e000/0x0/0x1bfc00000, data 0x13b1127/0x14e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1858519 data_alloc: 234881024 data_used: 14893056
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7e2e000/0x0/0x1bfc00000, data 0x13b1127/0x14e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7e2e000/0x0/0x1bfc00000, data 0x13b1127/0x14e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1858519 data_alloc: 234881024 data_used: 14893056
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156614656 unmapped: 49717248 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.062120438s of 18.189598083s, submitted: 35
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 164397056 unmapped: 41934848 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157736960 unmapped: 48594944 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b75c8000/0x0/0x1bfc00000, data 0x1c11127/0x1d40000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158597120 unmapped: 47734784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158597120 unmapped: 47734784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b752d000/0x0/0x1bfc00000, data 0x1ca3127/0x1dd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1946061 data_alloc: 234881024 data_used: 15290368
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158629888 unmapped: 47702016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158629888 unmapped: 47702016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158638080 unmapped: 47693824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158638080 unmapped: 47693824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1938785 data_alloc: 234881024 data_used: 15290368
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b751b000/0x0/0x1bfc00000, data 0x1cc4127/0x1df3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b751b000/0x0/0x1bfc00000, data 0x1cc4127/0x1df3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.988930702s of 13.904872894s, submitted: 97
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1939017 data_alloc: 234881024 data_used: 15290368
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7515000/0x0/0x1bfc00000, data 0x1cca127/0x1df9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7515000/0x0/0x1bfc00000, data 0x1cca127/0x1df9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158531584 unmapped: 47800320 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647aee7bc00 session 0x5647abd29680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 ms_handle_reset con 0x5647ac12b000 session 0x5647b0d483c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1944589 data_alloc: 234881024 data_used: 15278080
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7506000/0x0/0x1bfc00000, data 0x1cd8127/0x1e07000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b7506000/0x0/0x1bfc00000, data 0x1cd8127/0x1e07000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.635178566s of 10.499447823s, submitted: 33
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1943291 data_alloc: 234881024 data_used: 15286272
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159563776 unmapped: 46768128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159563776 unmapped: 46768128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 217 ms_handle_reset con 0x5647aee36800 session 0x5647adae4960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 217 ms_handle_reset con 0x5647b0576400 session 0x5647b059d860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159588352 unmapped: 46743552 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159612928 unmapped: 46718976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159629312 unmapped: 46702592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 217 heartbeat osd_stat(store_statfs(0x1b7503000/0x0/0x1bfc00000, data 0x1cd9d80/0x1e0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1945343 data_alloc: 234881024 data_used: 15462400
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 217 heartbeat osd_stat(store_statfs(0x1b7503000/0x0/0x1bfc00000, data 0x1cd9d80/0x1e0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 217 heartbeat osd_stat(store_statfs(0x1b7503000/0x0/0x1bfc00000, data 0x1cd9d80/0x1e0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1945343 data_alloc: 234881024 data_used: 15462400
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 217 handle_osd_map epochs [217,218], i have 217, src has [1,218]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.426203728s of 10.437616348s, submitted: 3
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159637504 unmapped: 46694400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b7500000/0x0/0x1bfc00000, data 0x1cdba2d/0x1e0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159768576 unmapped: 46563328 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1956461 data_alloc: 234881024 data_used: 16179200
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159768576 unmapped: 46563328 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b74ff000/0x0/0x1bfc00000, data 0x1cdba2d/0x1e0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b74ff000/0x0/0x1bfc00000, data 0x1cdba2d/0x1e0d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1963451 data_alloc: 234881024 data_used: 16187392
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b74fd000/0x0/0x1bfc00000, data 0x1cdd56c/0x1e10000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b74f9000/0x0/0x1bfc00000, data 0x1ce256c/0x1e15000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1963451 data_alloc: 234881024 data_used: 16187392
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159875072 unmapped: 46456832 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647aee7a400 session 0x5647ac1592c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.774030685s of 15.034091949s, submitted: 74
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647b0577400 session 0x5647adae5860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647b1bccc00 session 0x5647adae5c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680282 data_alloc: 218103808 data_used: 1691648
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1680602 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152068096 unmapped: 54263808 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 43.198913574s of 43.296794891s, submitted: 41
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ae868c00 session 0x5647adde54a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647aee36800 session 0x5647b059d4a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ab284c00 session 0x5647ab3f6d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ace88800 session 0x5647ade70b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152559616 unmapped: 53772288 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647abdf4400 session 0x5647ade5d860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1724967 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b872a000/0x0/0x1bfc00000, data 0xab3539/0xbe4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1724967 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b872a000/0x0/0x1bfc00000, data 0xab3539/0xbe4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1724967 data_alloc: 218103808 data_used: 1699840
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647aa934800 session 0x5647ac0e9680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b872a000/0x0/0x1bfc00000, data 0xab3539/0xbe4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647abc10000 session 0x5647abcb0d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152567808 unmapped: 53764096 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b872a000/0x0/0x1bfc00000, data 0xab3539/0xbe4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ad02b000 session 0x5647ab37d680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.393449783s of 14.481057167s, submitted: 26
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647aee36000 session 0x5647abcb01e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b872a000/0x0/0x1bfc00000, data 0xab3539/0xbe4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152764416 unmapped: 53567488 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152764416 unmapped: 53567488 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1732303 data_alloc: 218103808 data_used: 1703936
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 18K writes, 67K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.02 MB/s#012Cumulative WAL: 18K writes, 6126 syncs, 2.97 writes per sync, written: 0.06 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2834 writes, 10K keys, 2834 commit groups, 1.0 writes per commit group, ingest: 11.09 MB, 0.02 MB/s#012Interval WAL: 2834 writes, 1118 syncs, 2.53 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152764416 unmapped: 53567488 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152895488 unmapped: 53436416 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152895488 unmapped: 53436416 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152895488 unmapped: 53436416 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8704000/0x0/0x1bfc00000, data 0xad756c/0xc0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152895488 unmapped: 53436416 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759023 data_alloc: 218103808 data_used: 5472256
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152895488 unmapped: 53436416 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc ms_handle_reset ms_handle_reset con 0x5647aaef0c00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2716354406
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2716354406,v1:192.168.122.100:6801/2716354406]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: mgrc handle_mgr_configure stats_period=5
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152535040 unmapped: 53796864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8704000/0x0/0x1bfc00000, data 0xad756c/0xc0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152535040 unmapped: 53796864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152535040 unmapped: 53796864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152535040 unmapped: 53796864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8704000/0x0/0x1bfc00000, data 0xad756c/0xc0a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759023 data_alloc: 218103808 data_used: 5472256
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152535040 unmapped: 53796864 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.159555435s of 13.208255768s, submitted: 14
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153075712 unmapped: 53256192 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155303936 unmapped: 51027968 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 52035584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 52035584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1828249 data_alloc: 218103808 data_used: 5730304
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f42000/0x0/0x1bfc00000, data 0x129856c/0x13cb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 52035584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 52035584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154296320 unmapped: 52035584 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f42000/0x0/0x1bfc00000, data 0x129856c/0x13cb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154165248 unmapped: 52166656 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1827725 data_alloc: 218103808 data_used: 5730304
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.726887703s of 11.069490433s, submitted: 66
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ab388400 session 0x5647aaf0d0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ad3ea000 session 0x5647ac1705a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1825785 data_alloc: 218103808 data_used: 5730304
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1825785 data_alloc: 218103808 data_used: 5730304
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ae869800 session 0x5647adae4d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154247168 unmapped: 52084736 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154255360 unmapped: 52076544 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154271744 unmapped: 52060160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154271744 unmapped: 52060160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1827169 data_alloc: 218103808 data_used: 5808128
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154271744 unmapped: 52060160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.127736092s of 14.146677971s, submitted: 5
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154271744 unmapped: 52060160 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154353664 unmapped: 51978240 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [0,0,0,0,0,0,3])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154386432 unmapped: 51945472 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154484736 unmapped: 51847168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1829569 data_alloc: 218103808 data_used: 5869568
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154525696 unmapped: 51806208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154525696 unmapped: 51806208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154525696 unmapped: 51806208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f12000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839085 data_alloc: 218103808 data_used: 6713344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f10000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f10000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.032835960s of 12.983123779s, submitted: 341
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1839085 data_alloc: 218103808 data_used: 6713344
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154533888 unmapped: 51798016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f10000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154542080 unmapped: 51789824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154542080 unmapped: 51789824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154542080 unmapped: 51789824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b7f10000/0x0/0x1bfc00000, data 0x12c956c/0x13fc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ad171800 session 0x5647adae4000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647b0577800 session 0x5647ade71680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ab388400 session 0x5647ab81fe00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152395776 unmapped: 53936128 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152403968 unmapped: 53927936 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152412160 unmapped: 53919744 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152420352 unmapped: 53911552 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152420352 unmapped: 53911552 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 26 13:54:21 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/350107619' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152420352 unmapped: 53911552 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152420352 unmapped: 53911552 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152428544 unmapped: 53903360 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152436736 unmapped: 53895168 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152444928 unmapped: 53886976 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152453120 unmapped: 53878784 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152461312 unmapped: 53870592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152461312 unmapped: 53870592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152461312 unmapped: 53870592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152461312 unmapped: 53870592 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152469504 unmapped: 53862400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152469504 unmapped: 53862400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152469504 unmapped: 53862400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152469504 unmapped: 53862400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152469504 unmapped: 53862400 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152477696 unmapped: 53854208 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152485888 unmapped: 53846016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152485888 unmapped: 53846016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152485888 unmapped: 53846016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152485888 unmapped: 53846016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152485888 unmapped: 53846016 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152494080 unmapped: 53837824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152494080 unmapped: 53837824 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152502272 unmapped: 53829632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152502272 unmapped: 53829632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1694658 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152502272 unmapped: 53829632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152502272 unmapped: 53829632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152502272 unmapped: 53829632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8c08000/0x0/0x1bfc00000, data 0x5d4539/0x705000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 99.352607727s of 99.513916016s, submitted: 52
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152502272 unmapped: 53829632 heap: 206331904 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647b0576400 session 0x5647ab8201e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152510464 unmapped: 58023936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1750796 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152510464 unmapped: 58023936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8486000/0x0/0x1bfc00000, data 0xd57539/0xe88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ae868800 session 0x5647adc9a3c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ace9d000 session 0x5647adf3b2c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1750796 data_alloc: 218103808 data_used: 1761280
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647b0576c00 session 0x5647abd272c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 ms_handle_reset con 0x5647ab388400 session 0x5647adf3b680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8486000/0x0/0x1bfc00000, data 0xd57539/0xe88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8485000/0x0/0x1bfc00000, data 0xd5755c/0xe89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 152518656 unmapped: 58015744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153698304 unmapped: 56836096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1809102 data_alloc: 218103808 data_used: 9625600
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153698304 unmapped: 56836096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153698304 unmapped: 56836096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153698304 unmapped: 56836096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8485000/0x0/0x1bfc00000, data 0xd5755c/0xe89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153706496 unmapped: 56827904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153706496 unmapped: 56827904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1809102 data_alloc: 218103808 data_used: 9625600
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153706496 unmapped: 56827904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153706496 unmapped: 56827904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153714688 unmapped: 56819712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153714688 unmapped: 56819712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8485000/0x0/0x1bfc00000, data 0xd5755c/0xe89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 153722880 unmapped: 56811520 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.494569778s of 21.571447372s, submitted: 17
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817756 data_alloc: 218103808 data_used: 9641984
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154353664 unmapped: 56180736 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154353664 unmapped: 56180736 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154353664 unmapped: 56180736 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154353664 unmapped: 56180736 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154353664 unmapped: 56180736 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819894 data_alloc: 218103808 data_used: 9707520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819894 data_alloc: 218103808 data_used: 9707520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154361856 unmapped: 56172544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 56164352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 56164352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 56164352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819894 data_alloc: 218103808 data_used: 9707520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 56164352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154370048 unmapped: 56164352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154378240 unmapped: 56156160 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154378240 unmapped: 56156160 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154378240 unmapped: 56156160 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1819894 data_alloc: 218103808 data_used: 9707520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154386432 unmapped: 56147968 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154386432 unmapped: 56147968 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154386432 unmapped: 56147968 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b8390000/0x0/0x1bfc00000, data 0xe4c55c/0xf7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154394624 unmapped: 56139776 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154435584 unmapped: 56098816 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1820374 data_alloc: 234881024 data_used: 9814016
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154435584 unmapped: 56098816 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 154435584 unmapped: 56098816 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.659635544s of 26.826833725s, submitted: 10
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 220 ms_handle_reset con 0x5647b1bcc400 session 0x5647ab820d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 220 ms_handle_reset con 0x5647abdf6000 session 0x5647adf3a5a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 220 heartbeat osd_stat(store_statfs(0x1b838b000/0x0/0x1bfc00000, data 0xe4e1c4/0xf82000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 159129600 unmapped: 51404800 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 220 heartbeat osd_stat(store_statfs(0x1b838b000/0x0/0x1bfc00000, data 0xe4e1c4/0xf82000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 220 ms_handle_reset con 0x5647af3d6800 session 0x5647adf3b0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 221 ms_handle_reset con 0x5647aa991000 session 0x5647abd270e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156385280 unmapped: 54149120 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 221 heartbeat osd_stat(store_statfs(0x1b7fd0000/0x0/0x1bfc00000, data 0x1207e71/0x133d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 ms_handle_reset con 0x5647ab388400 session 0x5647ac13f860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156459008 unmapped: 54075392 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 ms_handle_reset con 0x5647abdf6000 session 0x5647b059c5a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1868340 data_alloc: 234881024 data_used: 10665984
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 ms_handle_reset con 0x5647af3d6800 session 0x5647aaef34a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156459008 unmapped: 54075392 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 ms_handle_reset con 0x5647b1bcc400 session 0x5647ab3fd0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 heartbeat osd_stat(store_statfs(0x1b7fcc000/0x0/0x1bfc00000, data 0x1209ae6/0x1340000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156459008 unmapped: 54075392 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156459008 unmapped: 54075392 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156459008 unmapped: 54075392 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156459008 unmapped: 54075392 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 heartbeat osd_stat(store_statfs(0x1b7fcc000/0x0/0x1bfc00000, data 0x1209ae6/0x1340000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1868340 data_alloc: 234881024 data_used: 10665984
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156557312 unmapped: 53977088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156557312 unmapped: 53977088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7fca000/0x0/0x1bfc00000, data 0x120b625/0x1343000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647abc10400 session 0x5647ade1a780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647ab388400 session 0x5647adde50e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647abdf6000 session 0x5647adae4b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647af3d6800 session 0x5647adae4780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.619624138s of 10.718444824s, submitted: 39
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156573696 unmapped: 53960704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647aea1c400 session 0x5647adde5e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647b1bcc400 session 0x5647adc9b860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647ab388400 session 0x5647abd27c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647abdf6000 session 0x5647ab2ccd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647aea1c400 session 0x5647abc21a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156581888 unmapped: 53952512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156581888 unmapped: 53952512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1872664 data_alloc: 234881024 data_used: 10665984
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156581888 unmapped: 53952512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647ad735000 session 0x5647abcb10e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7fc9000/0x0/0x1bfc00000, data 0x120b697/0x1345000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156581888 unmapped: 53952512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647b0f0f400 session 0x5647abcb1a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156581888 unmapped: 53952512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647ab388400 session 0x5647aaef32c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156581888 unmapped: 53952512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647abdf6000 session 0x5647ab8205a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 156729344 unmapped: 53805056 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1889226 data_alloc: 234881024 data_used: 12353536
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158572544 unmapped: 51961856 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7fa4000/0x0/0x1bfc00000, data 0x122f6a7/0x136a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7fa4000/0x0/0x1bfc00000, data 0x122f6a7/0x136a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7fa4000/0x0/0x1bfc00000, data 0x122f6a7/0x136a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1899626 data_alloc: 234881024 data_used: 13844480
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7fa4000/0x0/0x1bfc00000, data 0x122f6a7/0x136a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1899786 data_alloc: 234881024 data_used: 13848576
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.907283783s of 18.950902939s, submitted: 6
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158588928 unmapped: 51945472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 51871744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 51871744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 51871744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1913779 data_alloc: 234881024 data_used: 14028800
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 51871744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158662656 unmapped: 51871744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158900224 unmapped: 51634176 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158900224 unmapped: 51634176 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158900224 unmapped: 51634176 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1914355 data_alloc: 234881024 data_used: 14028800
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158900224 unmapped: 51634176 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158900224 unmapped: 51634176 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158900224 unmapped: 51634176 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.765879631s of 11.812744141s, submitted: 13
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1913971 data_alloc: 234881024 data_used: 14032896
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1913795 data_alloc: 234881024 data_used: 14032896
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1913795 data_alloc: 234881024 data_used: 14032896
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b7eee000/0x0/0x1bfc00000, data 0x12e56a7/0x1420000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 ms_handle_reset con 0x5647ae868400 session 0x5647ac0e85a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 158982144 unmapped: 51552256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.768287659s of 16.782152176s, submitted: 4
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 224 ms_handle_reset con 0x5647ae869000 session 0x5647abc20d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 224 ms_handle_reset con 0x5647ae86a800 session 0x5647ab30b860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 224 ms_handle_reset con 0x5647ac12ac00 session 0x5647adc9a1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 224 heartbeat osd_stat(store_statfs(0x1b7eea000/0x0/0x1bfc00000, data 0x12e7300/0x1423000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1989090 data_alloc: 234881024 data_used: 15421440
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166092800 unmapped: 44441600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 224 ms_handle_reset con 0x5647ab388400 session 0x5647ade1a1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 225 ms_handle_reset con 0x5647af3d6000 session 0x5647abe46d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162988032 unmapped: 47546368 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 225 handle_osd_map epochs [225,226], i have 225, src has [1,226]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 225 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 226 ms_handle_reset con 0x5647ace9d400 session 0x5647ad17ba40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 226 heartbeat osd_stat(store_statfs(0x1b61bc000/0x0/0x1bfc00000, data 0x300fc32/0x314f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 163078144 unmapped: 47456256 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 226 ms_handle_reset con 0x5647af315c00 session 0x5647abd270e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 226 ms_handle_reset con 0x5647af315c00 session 0x5647adc9b0e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 226 ms_handle_reset con 0x5647ab388400 session 0x5647ade5c780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162684928 unmapped: 47849472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162693120 unmapped: 47841280 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 ms_handle_reset con 0x5647ac12ac00 session 0x5647ab81e780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1943033 data_alloc: 234881024 data_used: 15433728
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162717696 unmapped: 47816704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 heartbeat osd_stat(store_statfs(0x1b61bb000/0x0/0x1bfc00000, data 0x30118fb/0x3152000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162717696 unmapped: 47816704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 ms_handle_reset con 0x5647ad735000 session 0x5647adde5680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 ms_handle_reset con 0x5647aea1c400 session 0x5647abd26d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162717696 unmapped: 47816704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 ms_handle_reset con 0x5647ab388400 session 0x5647abcb1680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 heartbeat osd_stat(store_statfs(0x1b7fbe000/0x0/0x1bfc00000, data 0x1212869/0x134f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162717696 unmapped: 47816704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162717696 unmapped: 47816704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1923808 data_alloc: 234881024 data_used: 14585856
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 227 handle_osd_map epochs [227,228], i have 227, src has [1,228]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.041545868s of 10.527981758s, submitted: 113
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162725888 unmapped: 47808512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 162725888 unmapped: 47808512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 ms_handle_reset con 0x5647ae868000 session 0x5647abcb05a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 160980992 unmapped: 49553408 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 160980992 unmapped: 49553408 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 ms_handle_reset con 0x5647ac0e5000 session 0x5647b059de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 ms_handle_reset con 0x5647aee7ac00 session 0x5647ade701e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 heartbeat osd_stat(store_statfs(0x1b8371000/0x0/0x1bfc00000, data 0xe5e062/0xf9c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 ms_handle_reset con 0x5647ae868400 session 0x5647ade71860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 heartbeat osd_stat(store_statfs(0x1b8371000/0x0/0x1bfc00000, data 0xe5e062/0xf9c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 heartbeat osd_stat(store_statfs(0x1b8371000/0x0/0x1bfc00000, data 0xe5e062/0xf9c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1756465 data_alloc: 218103808 data_used: 1806336
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 230 heartbeat osd_stat(store_statfs(0x1b8be7000/0x0/0x1bfc00000, data 0x5e7b9a/0x726000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 230 heartbeat osd_stat(store_statfs(0x1b8be7000/0x0/0x1bfc00000, data 0x5e7b9a/0x726000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759439 data_alloc: 218103808 data_used: 1806336
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759439 data_alloc: 218103808 data_used: 1806336
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157270016 unmapped: 53264384 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157278208 unmapped: 53256192 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157278208 unmapped: 53256192 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157278208 unmapped: 53256192 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157278208 unmapped: 53256192 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157278208 unmapped: 53256192 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157286400 unmapped: 53248000 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157294592 unmapped: 53239808 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157294592 unmapped: 53239808 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157294592 unmapped: 53239808 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157294592 unmapped: 53239808 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157302784 unmapped: 53231616 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157310976 unmapped: 53223424 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157310976 unmapped: 53223424 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157310976 unmapped: 53223424 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157310976 unmapped: 53223424 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157310976 unmapped: 53223424 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759599 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be4000/0x0/0x1bfc00000, data 0x5e96d9/0x729000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 73.682220459s of 73.853164673s, submitted: 81
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647ab284400 session 0x5647abcb1680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1762329 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b8be3000/0x0/0x1bfc00000, data 0x5e974b/0x72b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157319168 unmapped: 53215232 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157974528 unmapped: 52559872 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647ad6fdc00 session 0x5647ab81e780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647add0f000 session 0x5647adc9a1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647ad02d400 session 0x5647ac0e85a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647ad734400 session 0x5647aaef32c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647aa751800 session 0x5647ab2ccd20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157982720 unmapped: 52551680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83e8000/0x0/0x1bfc00000, data 0xde474b/0xf26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1831982 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157982720 unmapped: 52551680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83e8000/0x0/0x1bfc00000, data 0xde474b/0xf26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157982720 unmapped: 52551680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157982720 unmapped: 52551680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647af315000 session 0x5647adde5e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157982720 unmapped: 52551680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647b06bf800 session 0x5647adae4780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83e8000/0x0/0x1bfc00000, data 0xde474b/0xf26000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647aaf06c00 session 0x5647adae4b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 157982720 unmapped: 52551680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.565225601s of 10.700901031s, submitted: 30
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647ad02a000 session 0x5647adde50e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83c3000/0x0/0x1bfc00000, data 0xe0875b/0xf4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1837661 data_alloc: 218103808 data_used: 1810432
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155934720 unmapped: 54599680 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155820032 unmapped: 54714368 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83c3000/0x0/0x1bfc00000, data 0xe0875b/0xf4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1889793 data_alloc: 218103808 data_used: 8970240
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83c3000/0x0/0x1bfc00000, data 0xe0875b/0xf4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83c3000/0x0/0x1bfc00000, data 0xe0875b/0xf4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1889793 data_alloc: 218103808 data_used: 8970240
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 155852800 unmapped: 54681600 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.837846756s of 12.095314980s, submitted: 10
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b83c3000/0x0/0x1bfc00000, data 0xe0875b/0xf4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x68ef9c6), peers [0,1] op hist [0,2,2])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167870464 unmapped: 42663936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166985728 unmapped: 43548672 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018587 data_alloc: 218103808 data_used: 10489856
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018587 data_alloc: 218103808 data_used: 10489856
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018587 data_alloc: 218103808 data_used: 10489856
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018587 data_alloc: 218103808 data_used: 10489856
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2018587 data_alloc: 218103808 data_used: 10489856
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168034304 unmapped: 42500096 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168042496 unmapped: 42491904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168042496 unmapped: 42491904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168042496 unmapped: 42491904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b5eaa000/0x0/0x1bfc00000, data 0x1d7075b/0x1eb3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.732801437s of 27.949375153s, submitted: 121
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 170172416 unmapped: 40361984 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 2032838 data_alloc: 218103808 data_used: 10502144
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168067072 unmapped: 42467328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 ms_handle_reset con 0x5647ad6fc000 session 0x5647abd294a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168067072 unmapped: 42467328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 232 ms_handle_reset con 0x5647af315c00 session 0x5647b059c780
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168075264 unmapped: 42459136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168075264 unmapped: 42459136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 232 heartbeat osd_stat(store_statfs(0x1b5ca2000/0x0/0x1bfc00000, data 0x1f755b4/0x20ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 232 ms_handle_reset con 0x5647ad02c000 session 0x5647adae5e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 ms_handle_reset con 0x5647ad3ea400 session 0x5647abc20b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 168157184 unmapped: 42377216 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 ms_handle_reset con 0x5647abdf6000 session 0x5647adda7e00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 ms_handle_reset con 0x5647ac12d400 session 0x5647ac0e8d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 ms_handle_reset con 0x5647abdf6000 session 0x5647b059c1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2157123 data_alloc: 234881024 data_used: 26091520
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 187375616 unmapped: 23158784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 234 ms_handle_reset con 0x5647ad02c000 session 0x5647ade5d680
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180805632 unmapped: 29728768 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 234 handle_osd_map epochs [234,235], i have 234, src has [1,235]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 234 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647b224e000 session 0x5647ac0e8b40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 heartbeat osd_stat(store_statfs(0x1b5633000/0x0/0x1bfc00000, data 0x25e1eca/0x272a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 heartbeat osd_stat(store_statfs(0x1b562f000/0x0/0x1bfc00000, data 0x25e3b3f/0x272d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647ad735000 session 0x5647ad17b860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647abdf6000 session 0x5647aaf0de00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647ac12d400 session 0x5647abcb0d20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180944896 unmapped: 29589504 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180994048 unmapped: 29540352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647b224e000 session 0x5647ab0f61e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647ad02c000 session 0x5647ac13e1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180994048 unmapped: 29540352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 heartbeat osd_stat(store_statfs(0x1b562d000/0x0/0x1bfc00000, data 0x25e3bb1/0x272f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2150025 data_alloc: 234881024 data_used: 26095616
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 heartbeat osd_stat(store_statfs(0x1b562d000/0x0/0x1bfc00000, data 0x25e3bb1/0x272f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180994048 unmapped: 29540352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180994048 unmapped: 29540352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180994048 unmapped: 29540352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180994048 unmapped: 29540352 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 181010432 unmapped: 29523968 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 heartbeat osd_stat(store_statfs(0x1b562d000/0x0/0x1bfc00000, data 0x25e3bb1/0x272f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.574453354s of 15.184541702s, submitted: 92
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 ms_handle_reset con 0x5647aaef0000 session 0x5647adf3a1e0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 235 handle_osd_map epochs [235,236], i have 235, src has [1,236]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2152156 data_alloc: 234881024 data_used: 26112000
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180936704 unmapped: 29597696 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 236 ms_handle_reset con 0x5647aaef0000 session 0x5647ac158960
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180953088 unmapped: 29581312 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 236 heartbeat osd_stat(store_statfs(0x1b562c000/0x0/0x1bfc00000, data 0x25e56e0/0x2731000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180969472 unmapped: 29564928 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 181067776 unmapped: 29466624 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 237 ms_handle_reset con 0x5647b0e37000 session 0x5647ade703c0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 181084160 unmapped: 29450240 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 2105194 data_alloc: 234881024 data_used: 26103808
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 181084160 unmapped: 29450240 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 237 ms_handle_reset con 0x5647af314c00 session 0x5647ab821c20
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 237 heartbeat osd_stat(store_statfs(0x1b5c94000/0x0/0x1bfc00000, data 0x1f7e31b/0x20c9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 238 ms_handle_reset con 0x5647ad3eb800 session 0x5647ade5c5a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180051968 unmapped: 30482432 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 180051968 unmapped: 30482432 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 238 ms_handle_reset con 0x5647ad02b800 session 0x5647adf3be00
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 238 ms_handle_reset con 0x5647ae868400 session 0x5647ac13f860
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 171769856 unmapped: 38764544 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 238 ms_handle_reset con 0x5647aaef0000 session 0x5647abd265a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165756928 unmapped: 44777472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 238 handle_osd_map epochs [238,239], i have 238, src has [1,239]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.497013092s of 10.278826714s, submitted: 140
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 239 heartbeat osd_stat(store_statfs(0x1b7418000/0x0/0x1bfc00000, data 0x7f8b2f/0x945000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1829763 data_alloc: 218103808 data_used: 1859584
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165756928 unmapped: 44777472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 239 ms_handle_reset con 0x5647b06be800 session 0x5647abd29a40
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 239 ms_handle_reset con 0x5647b0f0e400 session 0x5647abd285a0
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165756928 unmapped: 44777472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165756928 unmapped: 44777472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165756928 unmapped: 44777472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 239 heartbeat osd_stat(store_statfs(0x1b7419000/0x0/0x1bfc00000, data 0x5f78bd/0x741000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165756928 unmapped: 44777472 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165773312 unmapped: 44761088 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165781504 unmapped: 44752896 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165789696 unmapped: 44744704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165789696 unmapped: 44744704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165789696 unmapped: 44744704 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165797888 unmapped: 44736512 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165806080 unmapped: 44728320 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165814272 unmapped: 44720128 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165822464 unmapped: 44711936 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165830656 unmapped: 44703744 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.1 total, 600.0 interval
Cumulative writes: 20K writes, 73K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
Cumulative WAL: 20K writes, 7078 syncs, 2.87 writes per sync, written: 0.06 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2124 writes, 6160 keys, 2124 commit groups, 1.0 writes per commit group, ingest: 4.65 MB, 0.01 MB/s
Interval WAL: 2124 writes, 952 syncs, 2.23 writes per sync, written: 0.00 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165838848 unmapped: 44695552 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165847040 unmapped: 44687360 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165847040 unmapped: 44687360 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165847040 unmapped: 44687360 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165847040 unmapped: 44687360 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165847040 unmapped: 44687360 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165847040 unmapped: 44687360 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165855232 unmapped: 44679168 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165863424 unmapped: 44670976 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165863424 unmapped: 44670976 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165863424 unmapped: 44670976 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b7619000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165871616 unmapped: 44662784 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1817241 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 221.851974487s of 221.905029297s, submitted: 32
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165879808 unmapped: 44654592 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 165904384 unmapped: 44630016 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167018496 unmapped: 43515904 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167026688 unmapped: 43507712 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167034880 unmapped: 43499520 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167034880 unmapped: 43499520 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167034880 unmapped: 43499520 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167034880 unmapped: 43499520 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167043072 unmapped: 43491328 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167051264 unmapped: 43483136 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167059456 unmapped: 43474944 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 167108608 unmapped: 43425792 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: bluestore.MempoolThread(0x5647a9a13b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1816361 data_alloc: 218103808 data_used: 1855488
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'config diff' '{prefix=config diff}'
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'config show' '{prefix=config show}'
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166985728 unmapped: 43548672 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x5f93fc/0x744000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x7e9f9c6), peers [0,1] op hist [])
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: prioritycache tune_memory target: 4294967296 mapped: 166748160 unmapped: 43786240 heap: 210534400 old mem: 2845415832 new mem: 2845415832
Jan 26 13:54:21 np0005596062 ceph-osd[79865]: do_command 'log dump' '{prefix=log dump}'
Jan 26 13:54:21 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 26 13:54:21 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1752056962' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 13:54:21 np0005596062 nova_compute[227313]: 2026-01-26 18:54:21.927 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 26 13:54:22 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1431424590' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 13:54:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:54:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:22.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:54:22 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 26 13:54:22 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3353467711' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 13:54:22 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:22 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:22 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:22.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 26 13:54:23 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/788607866' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 13:54:23 np0005596062 nova_compute[227313]: 2026-01-26 18:54:23.355 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:23 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/406394892' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2050119309' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 13:54:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:24.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2388152249' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 26 13:54:24 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1046011753' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 13:54:24 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:24 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:24 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:24.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2872540165' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/440987155' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1379196671' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2845126449' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 26 13:54:25 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4186090428' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1140170629' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 13:54:26 np0005596062 systemd[1]: Starting Hostname Service...
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 13:54:26 np0005596062 systemd[1]: Started Hostname Service.
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4134212791' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4165238776' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 13:54:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:26.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2442809828' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 26 13:54:26 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3940131182' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 13:54:26 np0005596062 nova_compute[227313]: 2026-01-26 18:54:26.930 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:26 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:26 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 13:54:26 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:26.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 13:54:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 26 13:54:27 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1478775918' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 13:54:27 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 26 13:54:27 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3386202874' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 13:54:27 np0005596062 podman[272576]: 2026-01-26 18:54:27.918095123 +0000 UTC m=+0.131845667 container health_status e64ced34ab39f5e6523a1fae05dd0b82e08fec779c7e537019e08bc2b1a9573b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '431ca7a7f3efd032b1aee96c3e0a533b29d789a5aae674c39b1aa51c9d150475-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231-3244a33f47fb6d1fff08a7f9fd1d0f52c647f19d23dd46a55b60ba58bfaa6231'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 13:54:28 np0005596062 nova_compute[227313]: 2026-01-26 18:54:28.357 227317 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 13:54:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:28.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 26 13:54:28 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3059537666' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 13:54:28 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:28 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:28 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:28.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 26 13:54:29 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1248309007' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 13:54:29 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 26 13:54:29 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4247769915' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 13:54:29 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.050 227317 DEBUG oslo_service.periodic_task [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.100 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.100 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.100 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.101 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.101 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/202972716' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2342754064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.559 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:54:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 13:54:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.102 - anonymous [26/Jan/2026:18:54:30.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.768 227317 WARNING nova.virt.libvirt.driver [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.770 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4391MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.770 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.771 227317 DEBUG oslo_concurrency.lockutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' 
Jan 26 13:54:30 np0005596062 ceph-mon[77178]: from='mgr.14132 192.168.122.100:0/3479430344' entity='mgr.compute-0.mbryrf' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 13:54:30 np0005596062 radosgw[83289]: ====== starting new request req=0x7fa18527e6f0 =====
Jan 26 13:54:30 np0005596062 radosgw[83289]: ====== req done req=0x7fa18527e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 13:54:30 np0005596062 radosgw[83289]: beast: 0x7fa18527e6f0: 192.168.122.100 - anonymous [26/Jan/2026:18:54:30.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.984 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 13:54:30 np0005596062 nova_compute[227313]: 2026-01-26 18:54:30.984 227317 DEBUG nova.compute.resource_tracker [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 13:54:31 np0005596062 nova_compute[227313]: 2026-01-26 18:54:31.002 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 13:54:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 26 13:54:31 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4252636109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 13:54:31 np0005596062 nova_compute[227313]: 2026-01-26 18:54:31.471 227317 DEBUG oslo_concurrency.processutils [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 13:54:31 np0005596062 nova_compute[227313]: 2026-01-26 18:54:31.477 227317 DEBUG nova.compute.provider_tree [None req-67e0c93c-36ef-449c-a609-2abd5eef22a3 - - - - - -] Inventory has not changed in ProviderTree for provider: 65600a65-69bc-488c-8c8c-71cbf43e523a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 13:54:31 np0005596062 ceph-mon[77178]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 26 13:54:31 np0005596062 ceph-mon[77178]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2041453231' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch